The Astrix Blog


Highlights from the Smartlab US Exchange 2018

San Diego, CA, February 20-21, 2018

Over 80 senior lab informatics professionals gathered in San Diego, CA on February 20-21 for the Smartlab Exchange 2018 to share experiences and discuss challenges associated with streamlining the laboratory operating environment.  Topics were varied, with presentations from a number of IT and scientific professionals across multiple industries.

We sat down with Astrix’s President, Dale Curtis, to discuss some of the key highlights and takeaways from the show.

What was the format of the conference?

The conference had a mixed format, which included presentations, along with focused business meetings, round table discussions and “Think Tank” group break-outs.  Think Tank break-outs were smaller groups focused on specific topics, such as “Integrating informatics with future technologies,” and “Lowering implementation & maintenance costs of quality systems (LIMS, ELN/LES) with smart data preparation.”  It was a unique opportunity to network with informatics professionals and scientists from a number of industries, ranging from drug discovery and development to agribusiness to QC testing labs.

What were your general impressions?

The conference featured a relatively small number of companies focused on the same area, laboratory informatics. In that regard, it was a unique opportunity to engage deeply on specific topics of interest. Seeing how companies are driving innovation in the midst of common constraints such as regulatory compliance, talent shortages, and antiquated technology stacks was a highlight of the conference. It was both interesting and informative to see how different companies are approaching the journey to digitally transform the lab.

What did you learn about how companies are approaching the laboratory digital transformation process?

Some companies are taking a more disruptive approach, while others are working from the ground up to integrate equipment, data capture processes, and analytics. Brandon Dohman, who heads up Syngenta’s Digital Innovation Lab in Illinois, walked us through their efforts to drive continuous innovation using a Lean Startup model, summarized in the figure below. The Lean Startup methodology starts from the premise “Should this product be built?” rather than “Can this product be built?” The idea is to first identify the problem that needs to be solved and then develop a minimum viable product (MVP) to begin the process of learning as quickly as possible. Once the MVP is in hand, the developers can fine-tune the solution using actionable metrics and data.

Brandon shared some early successes from his group. One involved using machine learning to develop an algorithm to automatically detect which crops are being grown and where. EPA documentation was used to train the algorithm, and it can be applied worldwide using global satellite imagery. Another example project significantly increased the speed with which the group was able to extract important data from PDF documents, placing it into a format that is readily available to researchers at Syngenta.
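The PDF-extraction project described above was not detailed at the code level, but the general pattern of turning free-form document text into a structured, researcher-friendly format can be sketched as follows. This is a minimal illustration, not Syngenta’s actual pipeline: the sample text, field names, and record layout are all hypothetical, and in practice the raw text would first be pulled from the PDFs with a library such as pdfminer or pypdf.

```python
import csv
import io
import re

# Hypothetical stand-in for text already extracted from a PDF report.
raw_text = """
Trial ID: T-1042  Crop: Maize   Yield: 9.1 t/ha
Trial ID: T-1043  Crop: Soybean Yield: 3.4 t/ha
"""

# Pull each trial record out of the free-form text with a regular expression.
record_re = re.compile(
    r"Trial ID:\s*(?P<trial>\S+)\s+Crop:\s*(?P<crop>\S+)\s+Yield:\s*(?P<yield>[\d.]+)"
)

def extract_records(text):
    """Return a list of dicts, one per trial record found in the text."""
    return [m.groupdict() for m in record_re.finditer(text)]

def to_csv(records):
    """Serialize the records to CSV so researchers can load them directly."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["trial", "crop", "yield"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

records = extract_records(raw_text)
print(to_csv(records))
```

The point of the sketch is the shape of the workflow, unstructured text in, tabular records out; a production version would add validation and handle the layout quirks of real PDF extractions.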

Reference – The Lean Startup Principles – http://theleanstartup.com/principles

Meanwhile, John Harmon of Neurocrine Biosciences presented a multi-year effort to rebuild the research informatics infrastructure using the CoreLIMS Platform for Science, beginning in 2009. In addition to technical requirements, John shared a number of “philosophical” requirements for the platform, including the idea that the platform should fit the scientific workflow (i.e., scientists should not have to do science around the platform) and that the platform can change without vendor involvement. Ultimately, the platform was shaped into a comprehensive solution for compound management that supported everything from laboratory automation integration (including chemical library enumeration) to high-throughput screening (HTS) to R integration for in vitro data analysis.

What about the Allotrope Framework that was recently released…was there talk about it?

The Allotrope Foundation is a group working on revolutionizing the way we acquire, share and gain insights from scientific data through a framework for standardization and linked data. The recent release of the Allotrope Framework and ADF represents an exciting advancement in the laboratory informatics space that supports scientific discovery and innovation. Adrienne Tymiak, Executive Director of Bioanalytical and Discovery Sciences at Bristol-Myers Squibb, made the case for facilitating information exchange between labs and breaking down unproductive data silos partly by relying on data standardization and ontologies, including the Allotrope Data Format (ADF). She showed why capturing the data context is critical for unlocking the potential of advanced analytics and computational modeling capabilities. Eric Little, Chief Scientific Officer at Osthus, gave a presentation on taking advantage of upstream data by utilizing semantic technologies to augment laboratory instrument data with Reference Data structures – using the ADF file format.

There were a few presentations on the Lab of the Future.  Can you tell us more about that?

Representatives from Merck (Dave Kniaz and Vikas Patel) presented their perspective on the “Lab IT Ecosystem of the Future,” ultimately designed to enable a future state centered on a “digital research experience” for scientists. They described the current state of IT capabilities in many of today’s life science R&D organizations, which are often plagued by point technology solutions and outdated architectures that create large, fragmented information silos. They made the case for transitioning to the Lab IT ecosystem of the future, built on a flexible, extensible, and scalable IT platform that can deliver the capabilities the lab of the future requires. In addition to modernizing IT architectures and user interfaces with cloud technologies, the speakers talked about the need to design the physical lab environment to integrate seamlessly with IT applications, such as human-machine interfaces that rely on voice and visual cues. The need to adopt pre-competitive data standards (such as the Allotrope Data Format) was also discussed.

Richard Caron, Associate Director of Quality Management IT at Eli Lilly, also touched on their perspective in the presentation, “Implementing a digital roadmap to transform quality/manufacturing operations.” Richard’s team has developed a roadmap to guide the quality organization through the digital transformation happening in manufacturing. The presentation walked us through Lilly’s journey by looking at the business drivers, digital principles (based on Industry 4.0 transformation pillars), and a manufacturing reference architecture. Richard presented a five-level Digital Plant Maturity Model, which starts with a pre-digital plant and ends with an adaptive plant featuring full end-to-end value chain integration from suppliers to patients.

Any other noteworthy themes?

Agile was a popular discussion topic with many applications. A panel discussion on “Getting the most from your strategic technology partners” included representatives from the software industry, pharma/biotech, and consulting firms. One of the takeaways was that vendors, sponsors, and consulting firms need to work together in an agile way to make the most of their collaboration. Goals and timelines may not be firm, and there may be uncertainties in the project plan. It is therefore critical to communicate about assumptions and expectations and to be willing to pivot when circumstances change.

From a software development perspective, Agile approaches are becoming more prevalent. At the same time, these methods often conflict with the traditional waterfall approaches embedded in the SDLC and processes of many large companies, particularly when GxP projects are undertaken. There was some talk about using Agile methods within the context of waterfall project plans as a way to move the needle forward. Nonetheless, this is an area that needs further attention from the industry to capitalize on the promise of Agile software approaches.

Data integrity was another notable theme, driven by the rising number of deviations and warning letters in GMP manufacturing, along with increased emphasis from regulatory authorities. Don Crossett of Avid Bioservices (a contract development and manufacturing organization, CDMO) gave a compelling presentation on going paperless without sacrificing data integrity. He described an important initiative he led to promote quality and data integrity across the organization, and stressed the role of IT in the quality management lifecycle and why quality is everyone’s responsibility. For CDMOs, quality is also a competitive advantage. Don cited a 2016 American Pharmaceutical Review survey showing that quality/accuracy of results was the most important performance metric to sponsors when evaluating CDMOs.