Common Mistakes in Computer System Validation

Posted in Laboratory Compliance on 3 December 2018

Modern scientific laboratories increasingly rely on computerized informatics systems (e.g., LIMS, ELN, CDS) to process and manage data. To help ensure product safety and effectiveness, these systems must be validated through a process known as computer system validation (CSV), which confirms the accuracy and integrity of processed data.

Regulatory agencies like the FDA require CSV to verify and document that a computerized system does exactly what it is designed to do in a consistent and reproducible manner. The FDA requires CSV when implementing a new system or when making a change (upgrades, patches, extensions, etc.) to an existing system that has been previously validated. Examples of computerized systems that the FDA requires to be validated include:

  • Laboratory data capture devices
  • Automated laboratory equipment
  • Manufacturing execution systems
  • Laboratory, clinical or manufacturing database systems

When done properly, computer system validation can be a smooth and efficient process. There are, however, several common mistakes that companies make when undertaking CSV that can result in a stalled or failed process. Let’s examine some of these mistakes in more detail.

Common Computer System Validation Mistakes

The goal of computer system validation is to document that computerized systems will work properly in all situations, consistently producing accurate results that enable regulatory compliance and the fulfillment of user requirements. CSV testing activities are conducted throughout the software development lifecycle (SDLC) – from system implementation to retirement.

There are many reasons why CSV activities can fail. Some of the more common ones that we see include:

Poor Planning. As with any technology project, good planning is essential. A Validation Plan should therefore be created before any validation activities begin. This plan should detail the approach for maintaining the validated status of the system over the full software development lifecycle (SDLC) and satisfy all regulatory policies and industry best practices (e.g., GAMP 5). It should also document the scope of validation activities, the testing approach, the testing team and their responsibilities, and the system acceptance criteria, and it should include site-specific information if the system being tested is used at multiple sites.

Poorly Defined Requirements. Any system implementation, upgrade, or extension should be preceded by a thorough workflow analysis to develop a clear set of system requirements. Without clear and precise requirements, CSV cannot adequately verify that the system is functioning as intended. Well-designed functional and/or user requirements should be numbered and listed in a matrix that is traceable to other validation documents (e.g., Test Scripts, Design Specifications, Validation Summary Report).
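As a loose illustration of the traceability idea (the requirement and test script IDs below are invented for this sketch, not drawn from any standard template), a simple check can flag numbered requirements that no test script exercises:

```python
# Hypothetical sketch: verify that every requirement ID in a traceability
# matrix is covered by at least one test script. All IDs are illustrative.

requirements = ["URS-001", "URS-002", "URS-003", "URS-004"]

# Mapping of test script IDs to the requirement IDs each script exercises.
test_scripts = {
    "TS-010": ["URS-001", "URS-002"],
    "TS-011": ["URS-003"],
}

# Collect every requirement touched by at least one script.
covered = {req for reqs in test_scripts.values() for req in reqs}
untraced = [req for req in requirements if req not in covered]

if untraced:
    print("Untraced requirements:", ", ".join(untraced))
else:
    print("All requirements are traced to test scripts.")
```

In this toy data, URS-004 has no associated test script and would be flagged, which is exactly the kind of gap a traceability matrix is meant to surface before an audit does.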

Ambiguous Test Scripts. Poorly defined requirements lead to ambiguous test scripts. Without precise requirements, it is hard to know exactly what you are testing for, and therefore difficult to confirm through CSV that the system in question is fulfilling its intended use or purpose.

Inadequate Definition of Expected Results. The expected results or acceptance criteria for CSV testing should be clearly and precisely defined in the Validation Plan.
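To make "clearly and precisely defined" concrete, here is a hedged sketch; the specification limits, statuses, and function name are hypothetical assumptions, not from any regulatory template. The point is that a precise expected result can be stated as a checkable criterion rather than a vague claim like "the system works correctly":

```python
# Hypothetical sketch: an expected result expressed as a concrete,
# checkable acceptance criterion. Limits and statuses are illustrative.

def result_status(value_mg, lower=90.0, upper=100.0):
    """Return the specification status for a measured result."""
    if value_mg < lower or value_mg > upper:
        return "Out of Specification"
    return "Within Specification"

# Expected result as it might be written in a test script:
# "A result of 101.5 mg is flagged 'Out of Specification'."
assert result_status(101.5) == "Out of Specification"
assert result_status(95.0) == "Within Specification"
print("Acceptance criteria met.")
```

A tester executing this step can mark it pass or fail without interpretation, which is the property the Validation Plan's acceptance criteria should have.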

Using Vendor Test Scripts. Vendor test scripts typically validate only the base system requirements and are thus not sufficient to ensure regulatory compliance.

Inexperienced Project Team. The CSV project team should have experience with CSV and knowledge of regulatory guidelines and compliance, laboratory processes, and the technology being validated. In some cases, it may be necessary to augment the validation team with subject matter expertise from a third party.

Inadequate Attention on the Project. Projects often fall behind schedule simply because team members must devote too much time to their day jobs. CSV should be the main responsibility of project team members for the duration of the project. The project team, along with individual responsibilities, should be clearly laid out in the Validation Plan.

Wasting Time on Low-Value Testing Activities. CSV activities can be time-consuming and costly. To avoid cost and time overruns, a risk-based assessment should be performed on the system to determine the required test cases and the optimal level of testing for each. The project team should focus on what is practical and achievable for those aspects of the system that affect quality assurance and regulatory compliance.
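As a rough sketch of what risk-based prioritization can look like (the scoring scales, thresholds, function names, and example functions below are hypothetical assumptions, loosely inspired by GAMP 5-style severity/likelihood/detectability scoring rather than any published formula):

```python
# Hypothetical risk-based test prioritization sketch. Scales (1-3 per axis)
# and the score threshold are illustrative assumptions only.

def risk_priority(severity, likelihood, detectability):
    """Multiply 1-3 scores on each axis; a higher product means higher risk."""
    return severity * likelihood * detectability

# Example system functions scored as (severity, likelihood, detectability).
functions = {
    "OOS result flagging":  (3, 2, 3),  # direct product-quality impact
    "Audit trail capture":  (3, 2, 2),  # data-integrity impact
    "Report font settings": (1, 1, 1),  # cosmetic
}

# Allocate the most rigorous testing to the highest-risk functions.
for name, axes in sorted(functions.items(),
                         key=lambda kv: risk_priority(*kv[1]),
                         reverse=True):
    score = risk_priority(*axes)
    level = "full scripted testing" if score >= 12 else "reduced testing"
    print(f"{name}: score {score} -> {level}")
```

The exact scales matter less than the discipline: scoring forces the team to justify why a cosmetic feature receives lighter testing than a function with quality or data-integrity impact.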

Inadequate Documentation. Comprehensive documentation covering the full SDLC must be created for all CSV processes and results in order to satisfy regulatory agencies. Failing to do so can cause you to fail an FDA audit.

Conclusion

Computer system validation is an important part of confirming the accuracy and integrity of your data, along with ensuring product safety and effectiveness. Effective, risk-based validation of computerized systems is also an important part of maintaining regulatory compliance. Inefficient or ineffective CSV processes prevent projects from being delivered on time and within budget and may also result in regulatory action that can be legally and financially devastating to an organization.

Astrix Technology Group has over 20 years’ experience helping scientific organizations conduct effective CSV processes to reduce compliance risk and mitigate data integrity concerns and business liability issues. Our computer system validation professionals provide you with a best practice CSV methodology, along with the peace of mind that comes from knowing your CSV documentation has been produced by experts.

In this blog, we’ve presented some of the most common CSV mistakes that we have encountered in the field, but there are many others. If you have additional questions about computer system validation, or would like an initial, no-obligation consultation with an Astrix informatics expert to discuss your validation project, please contact us.

About the Author

David Waters has worked for over 25 years in the pharmaceutical, biotech, and medical device industries and has done various validations including LIMS, Chromatography Data Systems, Laboratory Instrumentation, CAPA systems, and PLC/SCADA systems. David has also administered both SQL*LIMS and Labware LIMS. Prior to going into LIMS and Validation consulting work in 2003, David was R&D Manager of Analytical Development, QC Manager of Stability Services, and LIMS Manager/Administrator for a major pharmaceutical manufacturer.

About Astrix Technology Group

Scientific resources and technology solutions delivered on demand

Astrix Technology Group is an informatics consulting, professional services and staffing company that has been dedicated to serving the scientific community for over 20 years. We shape our clients’ future, combining deep scientific insight with an understanding of how technology and people will impact the scientific industries. Our focus on value-engineered solutions, on-demand resource and domain requirements, and flexible, scalable operating and business models helps our clients find future value and growth in scientific domains. Whether focused on strategies for Laboratories, IT or Staffing, Astrix has the people, skills and experience to effectively shape client value. We offer highly objective points of view on Enterprise Informatics, Laboratory Operations, Healthcare IT and Scientific Staffing with an emphasis on business and technology, leveraging our deep industry experience.

For More Information

For more information, contact Michael Zachowski at mzachowski@astrixinc.com or visit our website at Astrixinc.com.

 
