Reducing the Cost of Computer Systems Validation Efforts Through Risk Assessment

Posted in Lab Compliance, 13 November 2019

Computerized systems have been widely adopted by the pharmaceutical industry and are frequently used for instrument control and data evaluation, documentation, transmission and archiving in laboratories. The FDA requires that all computer systems in regulated environments be validated in a documented process known as computer system validation (CSV). CSV serves to verify that a computerized system does exactly what it is designed to do in a consistent and reproducible manner.

For regulatory compliance purposes, CSV confirms the accuracy and integrity of data created, modified, archived, retrieved and transmitted by a computer system in order to ensure product safety and effectiveness. Computer system validation is required when configuring a new system or making a change in a validated system (upgrades, patches, extensions, etc.).

Depending on the complexity and functionality of computer systems, CSV can be a significant undertaking. Pharmaceutical companies have even been known to delay upgrading applications to avoid the validation effort required. In this blog, we will provide some tips on performing a risk-based CSV in order to dramatically lower the costs of validation efforts.

GAMP 5 Validation Guidance

To help pharmaceutical companies understand and meet CSV regulations, the International Society for Pharmaceutical Engineering (ISPE) publishes the Good Automated Manufacturing Practice (GAMP) guides. The current guidance document, GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems, aligns with current FDA thinking and has become the CSV “bible” for the pharmaceutical industry. It provides pragmatic, practical information on how to implement a flexible, risk-based approach to CSV.

Laboratory Informatics System Risk Assessment

The FDA expects pharmaceutical companies to document that computer systems involved in GxP processes will work properly in all situations. Because CSV generally takes significant time and IT resources, however, zero system risk is both impractical and unattainable. In recognition of this fact, the FDA supports the flexible GAMP 5 approach, which applies a risk-based assessment of the system to determine the necessary test cases and the optimal level of testing for each. This risk-based approach allows companies to perform the right amount of CSV, with the right amount of detail, for each GxP system. If a company fails to perform a risk assessment on a system, the FDA holds it responsible for a full validation of that system.

With a GAMP 5 risk-based approach, the right amount of validation testing on a system will depend on the impact that the system has on data integrity, and ultimately, product (drug) quality. For example, a system used in early stage R&D will have a relatively low impact on finished product quality, and thus require less validation testing than a system used for product quality control.

Every laboratory informatics system (LIMS, LES, ELN, SDMS, CDS, etc.) software validation project should start with a formal risk assessment that answers the following questions:

  • What could go wrong with the system? (risk identification)
  • How likely is it that the system will not function properly at some point? (system complexity)
  • What are the consequences of the system not functioning properly? (system criticality)

Risk identification can come from a variety of sources – system stakeholders, historical data, subject matter experts (SMEs), vendor status (driven by vendor audit), etc. Qualified personnel should undertake a thorough process of identifying all potential system risks (system failures), and each potential risk should be documented.

Once the potential risks of system failure have been identified, they should be classified in terms of their likelihood of occurring. The likelihood of system failure can be thought of as a measure of system complexity, and is classified in GAMP 5 guidelines in the following way:

  • High: Custom developed functions within custom or commercial-off-the-shelf (COTS) systems
  • Medium: Configured functions within COTS systems
  • Low: Standard (non-configured) functions within COTS systems

Identified risks should also be classified according to their consequences, which is a measure of system criticality. This analysis will evaluate the kinds of data being produced by the laboratory informatics system in question to determine if it has any impact on data integrity, patient safety or product quality. Identifying consequences for each potential risk is best done by an SME who has knowledge of both laboratory processes and the system being validated. System criticality levels are characterized in the following way:

  • High: Direct impact on data integrity, patient safety or product quality
  • Medium: Indirect impact on data integrity, patient safety or product quality
  • Low: No impact on data integrity, patient safety or product quality
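The two classifications above are typically combined into a risk matrix that sets the level of validation testing for each function. The following is a minimal illustrative sketch, not a mapping prescribed by GAMP 5: the numeric scoring, thresholds, and function name are assumptions chosen to show how complexity and criticality ratings can be mechanically combined.

```python
# Hypothetical GAMP 5-style risk matrix: combine likelihood (system
# complexity) and consequence (system criticality) into a testing-rigor
# level. The scoring and thresholds here are illustrative assumptions.

LEVELS = {"Low": 1, "Medium": 2, "High": 3}

def risk_priority(complexity: str, criticality: str) -> str:
    """Map a complexity x criticality pair onto a validation testing level."""
    score = LEVELS[complexity] * LEVELS[criticality]
    if score >= 6:       # e.g. High complexity with Medium/High criticality
        return "High"    # most concentrated validation testing
    if score >= 3:       # mixed combinations
        return "Medium"  # targeted testing of the affected functions
    return "Low"         # lean on vendor testing / standard verification

# Example: a configured COTS function (Medium complexity) that produces
# quality-control data (High criticality) lands in the High band.
print(risk_priority("Medium", "High"))  # -> High
```

In practice the matrix would be filled in per identified risk and recorded in the Validation Plan; the point of the sketch is only that each (complexity, criticality) pair deterministically yields a testing level that can be justified to an auditor.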

The Validation Plan

As part of the documentation required by regulatory agencies, a Validation Plan document should be created before any validation activities begin on the system in question. The Validation Plan fully details the scope of validation activities, the testing approach, the testing team and their responsibilities, and the system acceptance criteria. The plan should also include site-specific information if the system being tested is used at multiple sites.

Once the three questions mentioned in the previous section have been answered in detail for the system, the information should be documented in a Risk Matrix that becomes part of the Validation Plan. This Risk Matrix essentially serves as the justification for the validation approach documented in the plan. The chief benefit of this risk-based approach to CSV is a significant reduction in both the cost and the duration of validation efforts.

Conclusion

A risk-based approach to CSV ensures that the system functionalities with the highest risk receive the most concentrated validation effort. Ultimately, the extent of validation should be determined by the risk a specific computer system or functionality poses to data integrity, product quality and patient safety. But simply claiming that a risk-based approach to CSV is being used is not enough to justify your validation activities to the FDA: a comprehensive risk assessment must be documented in the Validation Plan as justification for the extent of validation efforts on the system.

Astrix Technology Group has over 20 years’ experience helping scientific organizations conduct effective CSV processes to reduce compliance risk and mitigate data integrity concerns and business liability issues. Our computer system validation professionals provide you with a best practice CSV methodology, along with the peace of mind that comes from knowing your CSV documentation has been produced by experts. If you have additional questions about computer system validation, or would like to have an initial, no obligations consultation with an Astrix informatics expert to discuss your validation project, please contact us.

About Robert Knippenberg

Rob Knippenberg is a Managing Director for Astrix Technology Group in the Informatics Professional Services Practice. He is focused on customer informatics solutions delivery through strategic partnerships with many of the top scientific software and services providers. Mr. Knippenberg brings to the role over 25 years of experience in scientific software project and program management, directing widely distributed global teams. During his career, Mr. Knippenberg has worked with hundreds of commercial, academic, and government institutions, delivering scientific informatics solutions to many thousands of scientists.
