
Software Integration in the Laboratory: Leveraging Advanced Tools

CATEGORY
Interviews

DATE
June 21, 2021

Article Excerpt Featuring Dave Dorsett, Astrix Principal Software Architect

The concept of holistic lab orchestration hinges on addressing three basic problems, says Trish Meek, director of marketing at Thermo Fisher Scientific. ‘Firstly, getting all of the data together so that you can use it for analysis, visualisation, and increasingly, for leveraging AI and advanced machine learning tools. Then there’s the human experience in the lab. How do you optimise the scientific experience for scientists day-to-day? Third is the need to improve and facilitate process optimisation.’

Instrument and software vendors are already making moves to facilitate easier integration, Milne acknowledged. ‘From a lab connectivity perspective, when we look at instrumentation and software used for everyday lab work, such as sequencers, qPCR, flow cytometry, etc., the vendors of these types of instrumentation are already starting to think about connectivity when they develop their new instruments.’ Thermo Fisher Scientific, for example, is building connectivity into all of its new instrumentation, he continued. ‘But there will also need to be some sort of “retrofittability”, and that will be part of our initial offering. This will be achieved through the creation of a gateway that makes it possible to connect instruments and software in the lab into a cohesive environment.’
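To picture the gateway pattern Milne describes, the minimal sketch below translates two hypothetical vendors’ CSV exports into one common message shape. The vendor names, column headings and schema fields are all assumptions for illustration, not taken from any actual Thermo Fisher product.

```python
# Minimal sketch of a gateway's schema-translation step: each vendor-specific
# instrument export is mapped onto one shared record shape before it reaches
# downstream systems. All vendor names and field mappings are hypothetical.
import csv
import io
import json
from datetime import datetime, timezone

# Each adapter maps one vendor's column names onto the shared schema.
VENDOR_FIELD_MAPS = {
    "vendor_a_qpcr": {"Well": "well", "Cq": "value", "Target": "analyte"},
    "vendor_b_qpcr": {"Position": "well", "Ct": "value", "Gene": "analyte"},
}

def normalise(vendor: str, instrument_id: str, csv_text: str) -> list[dict]:
    """Translate one vendor-specific CSV export into the common schema."""
    field_map = VENDOR_FIELD_MAPS[vendor]
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        records.append({
            "instrument_id": instrument_id,
            "captured_at": datetime.now(timezone.utc).isoformat(),
            # Keep only the fields the shared schema knows about.
            **{common: row[vendor_col] for vendor_col, common in field_map.items()},
        })
    return records

# Two vendors, two file formats, one downstream message shape.
print(json.dumps(normalise("vendor_a_qpcr", "qpcr-07", "Well,Cq,Target\nA1,21.4,GAPDH"), indent=2))
print(json.dumps(normalise("vendor_b_qpcr", "qpcr-12", "Position,Ct,Gene\nB3,18.9,ACTB"), indent=2))
```

A production gateway would also handle transport, buffering and error recovery, but this schema-translation step is the heart of the ‘retrofittability’ Milne describes: older instruments only need an adapter, not a redesign.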

While there are multiple aspects to achieving seamless connectivity in the lab, the ultimate aim is to make laboratory systems more effective at what they do every day, suggests Dave Dorsett, principal software architect at information technology consultancy Astrix Technology. ‘That’s a foundational concept: how to improve usage of systems – such as a LIMS or ELN platform – from the perspective of everyday use, and how to get these systems to work together to support the labs on a day-to-day basis.’

Consider the software and hardware tools that a lab ecosystem relies on, and much of the friction in integration can commonly be traced to the diverse nature of instrument architectures, Dorsett noted. An organisation may have LIMS platforms from multiple vendors in use across different departments, for example, he said, mirroring Milne’s sentiments. ‘Some of these systems, whether LIMS platforms or other hardware or software, are more challenging to integrate than others. And this makes it costly for individual companies to set up and maintain them from an integration perspective.’

What this means at the most basic level is that many labs may still rely on manual data transcription or ‘scientist-facilitated integration’, Dorsett continued. ‘“Sneakernet” [physically transferring data from one PC to another using portable drives and devices] remains just part of everyday lab life. And no matter how careful you are with manual transcription and data input, or how effective your data review processes, the ultimate quality of that data is always going to be at risk.’
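Dorsett’s warning is easy to demonstrate: a manual copy gives you no evidence that a file arrived intact, whereas automated transfer pipelines routinely compare checksums at source and destination. The sketch below (Python, with invented file contents) illustrates the basic pattern; it is a sketch of the principle, not any particular product’s mechanism.

```python
# One reason "sneakernet" puts data quality at risk: nothing verifies that the
# file survived the journey intact. Automated transfers typically compare a
# checksum computed at the source with one computed at the destination.
import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large instrument outputs don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(source_path: str, destination_path: str) -> bool:
    """True only if the destination copy is bit-for-bit identical to the source."""
    return sha256_of(source_path) == sha256_of(destination_path)

if __name__ == "__main__":
    import os, shutil, tempfile
    # Simulate a transfer: write a small (invented) instrument export, copy it.
    src = tempfile.NamedTemporaryFile(delete=False, suffix=".csv")
    src.write(b"sample_id,result\nS-001,4.2\n")
    src.close()
    dst = src.name + ".copy"
    shutil.copyfile(src.name, dst)
    print(verify_transfer(src.name, dst))  # True: the copy arrived intact
    os.remove(src.name)
    os.remove(dst)
```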

In fact, there are two challenges, Dorsett suggested. Sometimes the issues are not so much with getting systems to talk to each other as with aligning and harmonising the data that comes out: getting data out of point systems and enabling it to flow to the next stage is another stumbling block to seamless lab integration. If it’s hard to get data from a LIMS, ELN or other key piece of software back out in an accessible and meaningful form, then it may not be possible to use that tool or platform to maximum effectiveness and efficiency.

Dorsett continued: ‘One approach to addressing such issues is to bring data from multiple systems into data lakes, where it can feasibly be compared, but again, you have to ensure that your data are equivalent, particularly where your labs may be running multiple LIMS or ELNs, for example. You may have one LIMS for stability testing, and another for batch release, plus method data in an ELN.’
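As a concrete, if simplified, illustration of that harmonisation step, the sketch below maps records from a hypothetical stability LIMS and batch-release LIMS, which report the same quantity under different field names and units, onto one shared record shape before they land in the lake. All field names, units and system identifiers are invented.

```python
# Hypothetical harmonisation step for a data lake fed by two LIMS. Field names,
# units and system identifiers are assumptions for illustration only.

def from_stability_lims(rec: dict) -> dict:
    """The stability LIMS reports concentration in mg/mL under its own names."""
    return {
        "sample_id": rec["SampleNo"],
        "test": rec["TestName"].strip().lower(),
        "result_mg_ml": float(rec["Result"]),
        "source_system": "stability_lims",
    }

def from_batch_release_lims(rec: dict) -> dict:
    """The batch-release LIMS reports the same quantity in µg/mL, so convert."""
    return {
        "sample_id": rec["sample_id"],
        "test": rec["assay"].strip().lower(),
        "result_mg_ml": float(rec["result_ug_ml"]) / 1000.0,
        "source_system": "batch_release_lims",
    }

# After harmonisation, records from both systems are directly comparable.
lake = [
    from_stability_lims({"SampleNo": "S-001", "TestName": "Assay A", "Result": "4.20"}),
    from_batch_release_lims({"sample_id": "S-001", "assay": "assay a", "result_ug_ml": "4200"}),
]
for record in lake:
    print(record)
```

Without the unit conversion and name alignment, the two systems’ numbers would sit side by side in the lake yet remain incomparable, which is exactly the equivalence problem Dorsett raises.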

A typical problem organisations face is how to compare all of that data once the systems have technically been integrated. ‘For any laboratory organisation, one of the biggest challenges to using the systems that they want to integrate is how to ensure both data quality and data comparability/equivalence across systems, even once they are interconnected. Are your experimental methods equivalent, for example, or does a sample ID from your LIMS match a sample ID from a CRO?’ Dorsett said.
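One small example of that equivalence problem: even once a CRO feed is technically connected, something still has to confirm that every CRO result traces back to a known LIMS sample. The sketch below flags results that cannot be reconciled; all identifiers and the cross-reference table are hypothetical.

```python
# Hypothetical reconciliation check between a LIMS and a CRO feed: connection
# alone does not guarantee that sample IDs on both sides refer to the same
# physical sample, so trace each CRO result through an agreed mapping table.

def find_orphan_results(lims_ids: set[str], cro_results: list[dict],
                        id_map: dict[str, str]) -> list[str]:
    """Return CRO sample IDs that cannot be traced back to a LIMS sample."""
    return [
        result["cro_sample_id"]
        for result in cro_results
        if id_map.get(result["cro_sample_id"]) not in lims_ids
    ]

lims_ids = {"S-001", "S-002"}
id_map = {"CRO-77": "S-001", "CRO-78": "S-002"}            # agreed cross-reference
cro_results = [{"cro_sample_id": "CRO-77", "value": 4.2},
               {"cro_sample_id": "CRO-99", "value": 1.1}]  # CRO-99 has no mapping
print(find_orphan_results(lims_ids, cro_results, id_map))  # ['CRO-99']
```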

It’s important to understand what tools are used at the level of the lab, facility and enterprise, as the basis for working out how to maximise use of that collective investment, identify key gaps, and define a longer-term roadmap that recognises the importance of sustainability and total cost of ownership. ‘You want to try to find ways of using integration technologies that are already there more effectively, as well as to be able to bring in new technologies,’ Dorsett said. For instrument integration, there are middleware companies that are positioned to offer specific software to facilitate it, Dorsett suggested, citing SmartLine Data Cockpit, TetraScience and BioBright – the latter acquired by Dotmatics in 2019. ‘These companies are focused on providing tools that can address how people gather all their data from the different instrumentation,’ said Dorsett.

Read the full article here: Scientific Computing World, June 18, 2021 issue

Intelligent integration | Scientific Computing World (scientific-computing.com)


LET’S GET STARTED

Contact us today and let’s begin working on a solution for your most complex strategy, technology and staffing challenges.

CONTACT US