Utilizing Vendor Scripts for Successful User Acceptance Testing

Posted on Lab Informatics. 8 August, 2018

Computer system validation (CSV) is a documented process that helps ensure that both new and existing computer systems consistently fulfill their intended purpose and produce accurate and reliable data. Most regulatory agencies around the world require validation of computer systems to ensure product safety and effectiveness. The software implementation and development team typically conducts CSV processes on the system being installed or updated to eliminate software bugs and make sure the system meets specifications.

Another important part of validating a computer system is User Acceptance Testing (UAT). UAT is performed by users familiar with the business requirements, typically after functional, system, and regression testing are completed. That said, the process for UAT varies widely across organizations and industries. Some organizations perform UAT right along with the other types of validation testing as part of the development process. This as-you-go methodology helps to get buy-in from the users on what is being developed and can sometimes be translated into usable validation scripts.

Some organizations make the mistake of not conducting UAT at all, while others simply have users repeat the functional testing conducted by the development team, essentially revalidating work the development team has already done. The bottom line is that UAT provides an important opportunity for users to check that the system developers have properly interpreted and implemented the functional requirements for a computer system prior to system rollout.

Many organizations realize the importance of UAT only after suffering financial losses associated with misconstrued requirements. Given that the cost of fixing software defects after release is many times greater than fixing these issues during development, it is critical that UAT be done properly. In this blog, we will discuss UAT best practices, including the use of vendor test scripts, to ensure a successful go-live event.

User Acceptance Testing Best Practices

The goal of functional software testing is to make sure the software is bug-free and meets the specifications detailed in the functional requirements. UAT helps make sure that the system does what it is intended to do from a user perspective. For UAT, you must arrange for the people who will actually be using the software to test it in a simulated laboratory environment with realistic data, possibly even under worst-case conditions, to try to make it fail. This process serves to verify that the system meets the requirements from the user’s perspective and will function as expected once it is deployed to a live environment.

A few additional best practices for UAT include:

Plan Properly. As complete test execution for a large application like a LIMS is not feasible, it is important to define test objectives by prioritizing critical business objectives using a risk-based assessment. A set of real-world use cases should be identified for execution in UAT, and these should be formally written up in clear and simple testing scripts. Time should also be allocated for developing an appropriate UAT testing environment. A testing plan should be developed that contains all relevant information necessary for conducting UAT – the dates, testing environment, testers, communication protocols, roles and responsibilities, templates, acceptance criteria, etc.
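The risk-based prioritization described above can be sketched in a few lines of code. The scoring model below (business impact times failure likelihood) and the example use cases are illustrative assumptions, not a prescribed methodology; real assessments typically follow an organization's documented risk procedure.

```python
# Hypothetical sketch: prioritizing UAT use cases by business risk.
# The impact/likelihood scales (1-5) and the use cases are illustrative.

def risk_score(impact: int, likelihood: int) -> int:
    """Simple risk score: business impact (1-5) times failure likelihood (1-5)."""
    return impact * likelihood

use_cases = [
    {"name": "Sample login and registration", "impact": 5, "likelihood": 3},
    {"name": "Result entry and review",       "impact": 5, "likelihood": 4},
    {"name": "Report label formatting",       "impact": 2, "likelihood": 2},
]

# Execute the highest-risk use cases first; low-scoring ones may be
# deferred or dropped when complete test execution is not feasible.
prioritized = sorted(
    use_cases,
    key=lambda uc: risk_score(uc["impact"], uc["likelihood"]),
    reverse=True,
)

for uc in prioritized:
    print(uc["name"], risk_score(uc["impact"], uc["likelihood"]))
```

Even a rough model like this gives the team an auditable rationale for which use cases made it into the UAT test plan and which were deliberately excluded.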

Choose a Good Team. UAT testers should ideally be chosen from every group or area of the organization that will be using the software so that all applicable user roles can be tested. The testers should be familiar with the new requirements that the system is intended to fulfill. It’s also important to choose a subject matter expert to manage the UAT process.

One of the biggest mistakes that companies make is to pass off the UAT to the functional testing team. This misses the whole point of UAT – the users will quickly be able to spot issues that the functional testers missed.

Define Your Acceptance Criteria. A set of acceptance criteria needs to be defined that will be used to determine if the software meets essential user requirements and is performing at an acceptable level for business users. The software must meet these criteria in UAT before the customer accepts it. All stakeholders involved in UAT need to be aligned on what pass/fail means as defined by the acceptance criteria.
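One way to keep all stakeholders aligned on what pass/fail means is to record the agreed criteria as explicit data and compute acceptance from them. The sketch below assumes a simple all-must-pass rule and illustrative criterion wording; real acceptance criteria would come from the UAT plan.

```python
# Hypothetical sketch: recording UAT results against explicit acceptance
# criteria so "pass/fail" means the same thing to every stakeholder.
# The criteria shown are illustrative, not a recommended set.

from dataclasses import dataclass

@dataclass
class Criterion:
    description: str
    passed: bool

def uat_accepted(criteria: list[Criterion]) -> bool:
    """The system is accepted only if every agreed criterion passed."""
    return all(c.passed for c in criteria)

results = [
    Criterion("All priority-1 test scripts executed without deviation", True),
    Criterion("No open critical or major defects", True),
    Criterion("Sample turnaround workflow completes end to end", False),
]

print(uat_accepted(results))  # False: one criterion failed, so no sign-off
```

Making the rule explicit avoids the common situation where testers, QA, and the business each apply a slightly different threshold for "acceptable."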

Create a Good Testing Environment. Usually, UAT is performed in a designated physical space (e.g., a conference room) that allows users, the project manager, and QA team representatives to sit together and work through the test cases. This process may need to happen in multiple locations for global organizations. Additionally, as described above, a simulated laboratory environment needs to be developed so users can test real-world use cases.

Utilizing Vendor Scripts for User Acceptance Testing

Another best practice recommendation for UAT is to utilize vendor scripts when feasible. Some vendors provide a set of validation scripts that can be used to validate standard functionality in their software as part of the overall package you receive when you purchase their software. Oftentimes, with just a few adjustments for the specific user environment, the project team can turn these vendor scripts into UAT scripts and save a lot of time in the process. The questions to ask when considering the use of vendor scripts for UAT are:

  • Do the vendor scripts accurately capture the software’s intended use as configured?
  • If not, can they be adjusted easily to serve in this regard?

Tips for working with vendor scripts for UAT testing:

Ensure the Vendor is on an Audit Schedule. Vendors are typically audited by their customers on a recurring basis (e.g., once every two years). This makes getting the vendor test scripts through QA review much smoother.

Use Only What You Need. Many vendor validation packages include testing above and beyond what is needed to validate a specific implementation. As stated above, UAT test objectives should be defined by prioritizing critical business objectives based on a risk-based assessment.

Verify that the validation package was created with the same version of the product that you are using. A validation package created for version 3.0 of a specific software is probably not very useful when you are implementing version 2.0 or 4.0.  Some scripts may still be valid, but others may be obsolete or not accurately reflect the implemented version.
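This version check lends itself to a quick automated filter. The sketch below assumes a simple major.minor matching rule and illustrative script names and version strings; a real package review would also confirm scope and content, not just version numbers.

```python
# Hypothetical sketch: flagging vendor validation scripts whose target
# version does not match the installed software version. Script names
# and version strings are illustrative.

def matching_scripts(installed: str, scripts: dict[str, str]) -> list[str]:
    """Return scripts written for the same major.minor version as installed."""
    target = tuple(installed.split(".")[:2])
    return [name for name, ver in scripts.items()
            if tuple(ver.split(".")[:2]) == target]

vendor_scripts = {
    "login_and_roles": "3.0.1",
    "sample_lifecycle": "3.0.2",
    "legacy_reporting": "2.4.0",
}

usable = matching_scripts("3.0.5", vendor_scripts)
print(usable)  # legacy_reporting is excluded: written for version 2.x
```

Scripts excluded by a check like this are not necessarily useless, but each one should be individually reviewed against the implemented version before being reused.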

Conclusion

For most laboratory informatics software, extensive validation is required to ensure that these systems consistently fulfill their intended purpose and produce accurate and reliable results that enable regulatory compliance, fulfillment of user requirements, and the ability to discern invalid and/or altered records.

User Acceptance Testing is one of the most important aspects of computer system validation. UAT frequently uncovers missed issues and helps to clarify requirements for informatics software, as users are able to test specific aspects of your platform that were not adequately tested in the development environment.

When viewed in the light of business and operational risks, the time and resources spent on an effective UAT process prior to go-live are well spent. Vendor scripts can be extremely helpful for saving time and reducing the costs of UAT, and they give users a sense of investment in the next version of the software, which encourages productivity and adoption.

Astrix Technology Group professionals have over 20 years of validation experience and regulatory compliance expertise. We understand that computer system validation and UAT are not “one size fits all” processes, and we work to create testing processes that are based on applicable regulations and guidance, best practices for the domain, and the characteristics of the system being tested. If you would like to discuss your informatics system validation and/or UAT testing strategy with an Astrix Informatics expert, please do not hesitate to contact us.

About the Author

Nicholas Osto is a Senior LIMS Consultant for Astrix Technology Group in the Informatics Professional Services Practice. He works with key project stakeholders to develop, configure, and validate LIMS implementations. Mr. Osto has over 10 years of pharmaceutical industry experience specializing in the development and validation of both LabWare and STARLIMS enterprise systems.