LIMS Master Data Best Practices Part 4 – Quality Control
So far in our LIMS master data best practices series, we have discussed how to define master data and create a Master Data Plan, how to effectively extrapolate master data from current records to configure your system, and how to configure your master data so it will be easy to maintain and scale as your organization grows and the system matures.
The Master Data Plan, along with the other documents we have discussed in previous blogs in this series, is part of an overall quality control process. High-quality data is (a sketch of how a few of these checks might be scripted follows this list):
- Valid – The data follows the current rules established for its type
- Unique – The data is not duplicated or redundant elsewhere in the system
- Consistent – The data is the same across all environments and systems where it is used
- Timely – The data represents only what is required at the present point in time
- Accurate – The data is free of errors and representative of reality
- Complete – All values required for use by customers and system users are available in the system
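Several of these dimensions can be checked programmatically against an export of master data. The sketch below is a minimal illustration covering uniqueness, completeness, and validity only; the field names, required-field set, and status rule are assumptions for the example, not features of any particular LIMS.

```python
# Illustrative only: assumes master data can be exported as a list of dicts.
from collections import Counter

REQUIRED_FIELDS = {"code", "name", "unit", "status"}   # assumed schema
VALID_STATUSES = {"Active", "Inactive", "Archived"}    # assumed validity rule

def quality_report(records):
    """Return basic uniqueness, completeness, and validity findings."""
    findings = []

    # Uniqueness: flag codes that appear more than once (case-insensitive).
    counts = Counter(r.get("code", "").strip().lower() for r in records)
    for code, n in counts.items():
        if n > 1:
            findings.append(f"Duplicate code '{code}' appears {n} times")

    for r in records:
        # Completeness: every required field must be populated.
        missing = [f for f in REQUIRED_FIELDS if not r.get(f)]
        if missing:
            findings.append(f"{r.get('code', '?')}: missing {missing}")

        # Validity: the status value must follow the established rule.
        if r.get("status") not in VALID_STATUSES:
            findings.append(f"{r.get('code', '?')}: invalid status '{r.get('status')}'")

    return findings
```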
Any changes to master data within a LIMS (or any other GxP-governed computerized system) should follow a formal Change Management SOP that defines the procedures for maintaining master data validity. A good change management procedure will cover all of the above-listed aspects of data quality to ensure master data quality is maintained and the data remains fit for its intended use.
A good Change Management SOP will be risk-based, meaning that changes with potential for higher impact will follow a more extensive procedure than changes with lower impact. The SOP should explain how to assess risk, how to handle objects of each risk class, and what triggers initiation of the change control process. Let’s look at some of the key information that should be included in your Change Management SOP in more detail.
Routine Change Control
Routine change control is the process defined for low- and medium-risk changes made to the LIMS. It is triggered on an as-needed or recurring basis, depending on requirement changes or the need for system maintenance. By defining both the process and what triggers it, your SOP helps ensure the master data in the LIMS accurately represents current business requirements. Key processes for updating master data that should be formalized in your SOP include:
Update all data impacted by the change. When making changes to your LIMS, be sure to consider all fields in the system that will be affected. Adding instructions to the SOP to check the LIMS Master Data Plan will help ensure all data impacted by the change is updated at the same time. When it is not, the result is a cycle of follow-on changes that cost time and effort.
Update all environments and systems impacted by the change. Most LIMS have a secondary environment where major changes can be developed without affecting the production system. When no major changes are underway, these environments drift out of sync if they are not updated regularly, and when the environment is needed it takes time to bring it current before any work can take place. Likewise, if a change is made to a system that the LIMS uses as a reference (such as adding a new laboratory instrument) and the other system is not updated as well, data consistency and accuracy are lost. During any change, it is important to include a check of these other environments and systems.
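One practical form such a check can take is a comparison of master data exports from each environment. The sketch below is illustrative only, assuming each environment's master data can be exported as a dictionary keyed by a stable identifier; the record structure and example values are assumptions.

```python
# Illustrative only: assumes each environment's master data can be exported
# as a dict keyed by a stable identifier (e.g. an analysis or field code).

def compare_environments(production, development):
    """Report records present in only one environment, plus records
    whose values differ between the two environments."""
    prod_keys, dev_keys = set(production), set(development)

    only_in_prod = sorted(prod_keys - dev_keys)
    only_in_dev = sorted(dev_keys - prod_keys)
    mismatched = sorted(
        key for key in prod_keys & dev_keys
        if production[key] != development[key]
    )
    return only_in_prod, only_in_dev, mismatched

# Example: flag anything that would make the environments inconsistent.
prod = {"ASSAY-001": {"unit": "mg/mL"}, "ASSAY-002": {"unit": "ppm"}}
dev = {"ASSAY-001": {"unit": "mg/L"}}
print(compare_environments(prod, dev))
# (['ASSAY-002'], [], ['ASSAY-001'])
```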
Avoid data duplication. Another issue to be aware of is that duplicates (or similar data) are often created when a deactivated field is added as a new item instead of re-activating the original, or when a new list item is added in uppercase when a lowercase version already exists in the database. When a new workflow or product is added, fields specific to that product or workflow are often created that are very similar to existing fields. Through these seemingly minor changes, data uniqueness is lost over time.
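A lightweight duplicate check run before a new list item is committed can catch many of these cases. The sketch below is an illustration only, assuming existing list values can be pulled as plain strings; the similarity threshold is an arbitrary example, not a recommended setting.

```python
# Illustrative only: flags list items that differ only by case/whitespace,
# or that are nearly identical, before a new item is added to a master data list.
from difflib import SequenceMatcher

def find_possible_duplicates(existing_items, new_item, threshold=0.9):
    """Return existing items the new item may duplicate."""
    normalized_new = " ".join(new_item.lower().split())
    hits = []
    for item in existing_items:
        normalized = " ".join(item.lower().split())
        if normalized == normalized_new:
            hits.append((item, "case/whitespace variant"))
        elif SequenceMatcher(None, normalized, normalized_new).ratio() >= threshold:
            hits.append((item, "similar value"))
    return hits

print(find_possible_duplicates(["Purified Water", "WFI"], "purified  water"))
# [('Purified Water', 'case/whitespace variant')]
```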
Perform periodic reviews and audits. One of the best ways to keep data consistent and unique is to perform periodic reviews and audits. Your Change Management SOP should describe how to conduct reviews and audits, include a risk-based schedule for how often data should be reviewed, and provide templates for performing the review. A few suggestions of what to review (one way to flag stale records is sketched after this list):
- User accounts: For companies that must comply with 21 CFR Part 11 and EU Annex 11, user accounts must be audited on a regular basis. Even for companies not subject to these regulations, auditing user accounts is highly recommended to ensure your data is secure.
- Sample points and sample schedules: Because these are activated and deactivated often, it is easy to lose track of those that have been deactivated for long periods of time. Those that are no longer used should be moved to an archived or decommissioned status.
- Instruments and equipment: Even when instruments and equipment are stand-alone systems, they are often held in lists in other systems such as a LIMS. Those that can interface with a LIMS or other system, such as HPLC or MS instruments, are often modular, and the modules can be swapped out. These should be reviewed to verify their current status, and anything no longer in use should be decommissioned or archived.
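As referenced above, a simple script can support such a review by flagging records that have been inactive beyond a risk-based threshold. The record structure, field names, and threshold below are assumptions for illustration, not a prescribed review rule.

```python
# Illustrative only: flags records (user accounts, sample points, instruments)
# whose last activity is older than a review threshold, so they can be
# archived or decommissioned during a periodic review.
from datetime import date, timedelta

def flag_for_review(records, max_inactive_days=365, today=None):
    """Return records whose last activity is older than the threshold."""
    today = today or date.today()
    cutoff = today - timedelta(days=max_inactive_days)
    return [r for r in records if r["last_used"] < cutoff]

sample_points = [
    {"id": "SP-014", "last_used": date(2020, 3, 1)},
    {"id": "SP-022", "last_used": date(2023, 9, 15)},
]
for record in flag_for_review(sample_points, today=date(2024, 1, 1)):
    print(f"{record['id']}: review for archive/decommission")
```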
High-Risk Changes
High-risk changes to a LIMS are often triggered by the need for a data migration or a system upgrade. Processes that should be formalized in your Change Management SOP when updating master data for high-risk changes include:
Define the extended process. For a high-risk change, the Change Control procedure should include pre-approval of the change, testing and validation of new data or processes, and post-approval. For data migration, a Data Migration Plan should be in place to manage the change. For system updates, design and development documentation and code review should be included if any programming or custom coding is involved.
If a development phase for the change is required, a snapshot of the current system should be used to build the development environment. This freezes the data so developers can work on changes without the added complexity of ongoing routine updates. This is also a good time to create or update the Master Data Plan for all systems involved. Using the periodic review and audit procedures and templates can help with this process.
Utilize a current snapshot of the system for validation activities. Development and testing can last months or even years, depending on the nature of the change. During this time, production data continues to grow and routine change control still occurs, so data accuracy and consistency between the production and development environments are lost. When it comes time to validate new data or processes, the results may not be accurate because the system will have changed.
To combat this problem, a current snapshot of the system should be used for validation. If a validation environment is available, this means migrating the current master data from the production environment to the validation environment just before validation activities take place. If a validation environment is not available, the current master data should be migrated to the development environment. This ensures the data used is accurate, timely, and consistent with the current system state. It also reduces the risk of errors during the validation phase.
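As a simple illustration, a gate check like the one sketched below could confirm that the validation environment's snapshot is recent before validation begins. The refresh-date field and the seven-day threshold are assumptions for the example, not a prescribed standard.

```python
# Illustrative only: a gate check before validation starts, assuming the
# validation environment records the date its master data was last
# refreshed from production.
from datetime import date

def snapshot_is_current(last_refresh, max_age_days=7, today=None):
    """True if the validation environment was refreshed recently enough."""
    today = today or date.today()
    return (today - last_refresh).days <= max_age_days

if not snapshot_is_current(date(2023, 11, 2), today=date(2024, 1, 5)):
    print("Refresh master data from production before validation begins")
```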
Once validation is complete, the change may be moved to the production environment. Post-approval activities should include a verification check to ensure changes are complete before releasing the system to users.
Conclusion
A change management procedure is the key to ensuring master data quality is maintained. The procedure should be risk-based, providing appropriate instructions and triggers to identify and manage each level of change. Routine change management procedures describe how to keep data accurate and timely and ensure data updates are valid and complete. Periodic reviews and audits, as well as the Master Data Plan, should be included to ensure data stays consistent and unique. Procedures for high-risk changes must address all aspects of data quality through the different phases of the change.
Be sure to tune in for part 5 of our Master Data Blog Series, where we will discuss important considerations for master data management in mergers and acquisitions.
Astrix is a laboratory informatics consulting firm that has been serving the scientific community since 1995. Our experienced professionals help implement innovative solutions that allow organizations to turn data into knowledge, increase organizational efficiency, improve quality and facilitate regulatory compliance. If you have any questions about our service offerings, or if you would like to have an initial, no-obligations consultation with an Astrix informatics expert to discuss your master data strategy or LIMS implementation project, don’t hesitate to contact us.