Data integrity keeps life-saving research and information safe

Quality and safety in drug production are the overriding concerns. Now regulators are tightening their grip on the integrity of all the data produced.

In the pharmaceutical industry, regulations require close monitoring of the manufacturing and logistics of drugs to guarantee their quality, safety and efficacy. This means numerous measurements of different kinds that produce an ever-growing amount of data. This information has to be kept safe, available, and traceable.

However, the guidance provided by regulators has until now not focused on data integrity, and inherent problems with data collection and storage have gone unaddressed.

Targeting data integrity

In fact, regulators have found gaps between industry practice and existing technology. Many companies have misinterpreted the earlier guidance or do not know how to apply its demands to their systems, and as a result do not achieve sufficient data integrity.

Published GMP non-compliance reports, warning letters, import alerts, and similar notices have made it clear that regulators are targeting data integrity failures in their current inspections.

Subsequent enforcement actions have led to the withdrawal of supply across multiple markets, product recalls, and consent decrees. They have also damaged the reputations of the companies involved, to varying degrees.

With increased targeting of data integrity from regulators, it is crucial that everyone in the business understand correct data management practices.

Correct, traceable, reliable

The basic principle of data integrity is simple: all data collected and stored must be correct, traceable, and reliable.

The Food and Drug Administration (FDA) in the US, the Medicines and Healthcare products Regulatory Agency (MHRA) in the UK, and the World Health Organization (WHO) use an acronym to make it easier to remember the expectations for records.

ALCOA stands for:
A = attributable to the person generating the data
L = legible and permanent
C = contemporaneously recorded
O = original or a true copy
A = accurate

WHO has extended these requirements to ALCOA+. The plus stands for the attributes of complete, consistent, enduring, and available.

Today, ALCOA+ is the goal for every piece of information that can impact the purity, efficacy, and safety of products. It is also the standard by which data will be evaluated. In practice, this means that a company must maintain control over both intentional and unintentional changes to its data.

Paper or bits – all the same

The same requirements apply to all records, be they on paper, stored electronically, or both.

When data is gathered and stored manually, there are obvious points of possible failure: operators can forget to record information, write down wrong data by mistake, lose records, or even intentionally falsify data.

One might think that automation would solve these types of problems once and for all, and to a large extent it does. But computerized systems have their own, more technical problems.

For example, one German company received an FDA Warning Letter due to data integrity failures in its system. The failures included insufficient controls to prevent unauthorized access or changes to data; missing access controls and audit trail capabilities; a single shared user name with administrator rights for all employees; and the possibility to manipulate or delete electronic data.

Each of these deviations could have been addressed with appropriate systems and methods: unique usernames and passwords, an inerasable audit trail or event log, separate administrator and user access rights, good standard operating procedures, and regular oversight and review of processes.
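
As a minimal illustration of how separate administrator and user rights can be modeled in software, the sketch below reduces the idea to an explicit permission check tied to each unique account. The roles, permissions, and usernames are hypothetical, not a description of any particular monitoring product.

# Minimal sketch of role-based access control with unique user accounts.
# Roles, permissions, and usernames are hypothetical illustrations.
from dataclasses import dataclass

ROLE_PERMISSIONS = {
    "administrator": {"configure_system", "manage_users", "view_data"},
    "operator": {"record_data", "view_data"},
    "reviewer": {"view_data", "review_audit_trail"},
}

@dataclass(frozen=True)
class User:
    username: str  # unique per person, never shared
    role: str

def is_allowed(user: User, action: str) -> bool:
    """Return True only if the user's role grants the requested action."""
    return action in ROLE_PERMISSIONS.get(user.role, set())

# Example: an operator can record data but cannot reconfigure the system.
operator = User(username="jsmith", role="operator")
assert is_allowed(operator, "record_data")
assert not is_allowed(operator, "configure_system")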

Systems need to match risks

To maintain data integrity in their systems, organizations need to focus on several different functions and knowledge areas. Some of the most important aspects are outlined below.

The first is quality risk management, which starts with the company understanding how its data impacts product quality and patient safety. The company should also understand the technologies used in its data processes, as well as their limitations.

The systems implemented must provide an acceptable state of control that matches process criticality and risks. Any points where unauthorized deletion or amendment of data is possible should be identified and documented, and risk assessments to tackle those issues scheduled and performed regularly.

Internal audits need to be put in place to identify potential problems, with detailed review processes for data integrity issues. In-house data audits should be performed routinely on audit trails, raw data and metadata, and original records. System access rights also need to be checked regularly.

During the lifecycle of data, change management and control of incidents and deviations are required, as well as corrective and preventive action (CAPA) processes and procedures.

All this has to be documented, using good documentation practices and following requirements in regulations for computerized systems.

Competence and culture

The second key area is personnel. All those involved in data gathering, processing, and storing should have clearly defined roles and responsibilities, with appropriate access rights and privileges for each system. Responsibility for data throughout its entire lifecycle should also be assigned unambiguously.

Data integrity is also a question of corporate culture. Key elements include encouraging a culture that supports issue reporting and rewarding proper conduct. Any compliance failures need to be analyzed thoroughly so that their root causes can be fixed systematically.

Regular training of personnel is required, and the training needs to be matched to different roles, including quality assurance, quality control, production, and management of data.

But the scope does not end with the company's own organization. The company needs to ensure that its system manufacturers and service providers do not jeopardize data integrity in any way: they should have qualified and trained personnel and quality management systems in place, and comply with standards such as ISO 9001 or 17025.

The systems and services of vendors and providers also need to be checked regularly, and audited when possible.

Right provider paramount

When the framework has been set up, there are practical tools for ensuring data integrity in computerized systems.

This starts with selecting the right system and service providers. Not only should they be fluent in all the relevant regulations, but they should also have the right kind of organizational culture to be entrusted with the data. The systems must be fit for purpose, and the provider should be able to supply proof of the software's suitability for the application. The provider must also include dedicated functions in its system that ensure data integrity.

Here it is important to remember that expertise and knowhow may cost a bit more, but will eventually save time and money.

Also, the organization’s own IT environment should be fully qualified. Data and its metadata need to be saved and backed up to a secure location regularly, and the ability to retrieve the backups has to be verified during internal audits.
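
As an illustration of what verifying backup retrieval can look like in practice, the following sketch compares checksums of the original data and the restored copy. The file paths and function names are hypothetical assumptions for the example.

# Minimal sketch of verifying that a backup can be retrieved intact.
# File paths are hypothetical illustrations.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum a file so the backup copy can be compared with the original."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(original: Path, backup: Path) -> bool:
    """A backup is only useful if it exists and matches the source data."""
    return backup.exists() and sha256_of(original) == sha256_of(backup)

# Example check during an internal audit:
# verify_backup(Path("data/records.db"), Path("/secure/backup/records.db"))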

Risk defines validation depth

The risk level of the system defines the depth of validation required: the higher the risk, the deeper the validation. Only systems that are part of GxP compliance are to be validated.

In some cases, it is cost-effective to have the system vendor perform qualification and validation of the systems. To help decide between in-house validation and a purchased validation service, ISPE's GAMP 5 (Good Automated Manufacturing Practice) categorizations can be used to determine the validation complexity of the system.

The validation protocols need to address data quality and reliability, and during validation all data storage locations, including electronic records, printouts, and PDF reports, need to be accounted for.

The company's quality management system has to define the frequency, roles, and responsibilities in system validation, and the validation master plan must outline the approach used to review meaningful metadata, including audit trails.

After the initial validation, re-evaluations of the systems need to be done periodically, according to a pre-set schedule.

Audit trail shows compliance

The system must have an audit trail – an inerasable record of all data in a system, including any changes that have been made to a database or file. The audit trail must answer the questions of who, what, when and why. This information is crucial in GxP compliance.

The first step is to define the data relevant to GxP compliance and include it in the audit trail.

Clear roles and schedules are needed in the testing of the audit trail functionality. The depth of the review is based on the complexity of the system and its intended use.

Audit trails comprise discrete event logs, history files, database queries, reports, or other mechanisms that display events related to the system, electronic records, or raw data contained within the record.
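
A minimal sketch of an append-only event log that captures who, what, when, and why is shown below. The record fields and file-based storage format are illustrative assumptions, not a description of any specific system's audit trail.

# Minimal sketch of an append-only audit trail entry answering who, what, when, and why.
# The record fields and file-based storage are hypothetical illustrations.
import json
from datetime import datetime, timezone

def append_audit_event(log_path: str, who: str, what: str, why: str) -> None:
    """Append one event line; existing lines are never edited or removed."""
    event = {
        "who": who,                                      # unique username of the actor
        "what": what,                                    # the action or change performed
        "when": datetime.now(timezone.utc).isoformat(),  # contemporaneous timestamp
        "why": why,                                      # documented reason for the change
    }
    with open(log_path, "a", encoding="utf-8") as log:   # append-only: no rewrites
        log.write(json.dumps(event) + "\n")

# Example:
# append_audit_event("audit_trail.log", "jsmith",
#                    "corrected sensor 3 calibration offset", "deviation CAPA-042")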

Preparing for changes – and the worst

Regulation and compliance requirements are not set in stone, so systems need to be able to change with them. Hence, it is wise to select systems that are easy to update upon the addition of new hardware or other system inputs.

To provide for power outages or network downtime, it is advisable to use software and systems that can record and store data redundantly. Solutions such as an uninterruptible power supply (UPS), battery-powered standalone recorders, or devices that can switch to an alternate power source when required are also useful. Data loggers, for example, can be battery powered.
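
The sketch below illustrates the idea of redundant recording: measurements are always written to local storage first and forwarded to the central system once connectivity returns. The function names, transport, and file format are hypothetical assumptions for the example.

# Minimal sketch of redundant recording: measurements are buffered locally and
# forwarded when the network is available again. Names and formats are hypothetical.
import json
import time

BUFFER_PATH = "local_buffer.jsonl"

def record_measurement(value: float) -> None:
    """Write to local storage first, so a network outage never loses a sample."""
    sample = {"value": value, "timestamp": time.time()}
    with open(BUFFER_PATH, "a", encoding="utf-8") as buf:
        buf.write(json.dumps(sample) + "\n")

def flush_buffer(send) -> None:
    """Forward buffered samples once connectivity returns.

    `send` is whatever transport the central system uses; it should raise on
    failure so unsent samples stay in the buffer for the next attempt.
    """
    try:
        with open(BUFFER_PATH, "r", encoding="utf-8") as buf:
            samples = [json.loads(line) for line in buf]
    except FileNotFoundError:
        return
    for sample in samples:
        send(sample)
    open(BUFFER_PATH, "w").close()  # clear the buffer only after all samples were sent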

The system’s software updates need to be designed to comply with changing regulations, especially when implementing new features. Collaboration with providers helps in staying informed about changes and software updates, so the systems can be updated accordingly.

Data integrity investigations should also include business continuity aspects in case something goes badly wrong, so a robust disaster recovery plan covering different risk scenarios must be created. This plan should state how functions and data can be restored quickly, as well as the probable impact of any lost data.

Conclusion

By implementing correct data management practices that include behavioral, procedural and technological controls, the risks of flawed, incomplete or erroneous data are mitigated.

Data integrity is about more than compliance with regulations; it is about protecting life-saving research and products for human use. In GxP applications, data often represents significant investments in development, clinical trials, donated tissue, and the hopes of patients for a new therapy or drug. The data represent assets that require fail-safe, trustworthy systems and practices that ensure patient safety.

The devices, software, infrastructure, processes, and operating procedures must all be aligned to ensure that data are complete, consistent, and accurate, and exemplify the characteristics of ALCOA+.

 

Author: Piritta Maunu is a Life Science Industry Expert at Vaisala with many years of experience in biotechnology. She has worked in quality management, R&D, and GMP production. Piritta holds an M.Sc. in Cell Biology and is an instructor of General Biology.

As a manufacturer of environmental measurement and monitoring systems, Vaisala is invested in understanding the relationship between computerized systems, network functionality, device capabilities, and data integrity. Over the past decade, the company has continuously developed its viewLinc monitoring system software, with ensuring data integrity as an important goal.
