In a recent blog, Informatics Challenges in Pharma and Biopharma R&D, we identified five major challenges facing R&D and IT managers in pharma and biopharma companies as they strive to equip their researchers with modern, flexible and responsive informatics tools:

  • Managing New Science
  • Dealing with Change
  • Integrating Old and New Systems
  • Collaboration and Externalization
  • Harmonization and Simplification

This post is the first in a series that drills into each of these challenges in more detail, with examples drawn from our staff's collaborations with customers as we worked to understand their requirements and to devise and deliver solutions. First, we explore the informatics challenges in managing new science.

Managing New Science

As Michael Elliott, CEO of Atrium Research & Consulting LLC points out, “The rapid scientific advancement in areas such as next-generation sequencing, precision medicine and next-generation biologics increases the volume, variety and the complexity of datasets. IT teams are having a difficult time keeping up with these changes”.[1]

There are several reasons why IT staff are having difficulties. They may attempt to adapt existing systems already in place to manage workflows and capture data, such as Laboratory Information Management Systems (LIMS), Electronic Lab Notebooks (ELNs) and small molecule registration and inventory systems. But when they do, they tend to find that these systems aren’t flexible enough to cope with the volume, variety and complexity of datasets listed by Elliott.

Complex Next Generation Sequencing Workflows

Next Generation Sequencing (NGS) workflows are complex, involving a variety of sequencing instruments (e.g., from Illumina, Ion Torrent™, PacBio or Life Technologies) running different analyses, numerous sample preparation kits and protocols, and multiple liquid handling events. Given all the available sequencing technologies, assays, analytical instruments and automation robots leveraged in NGS, we estimate that there are more than 10,000 possible unique NGS laboratory configurations.
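To see how quickly those combinations multiply, consider the arithmetic with a few hypothetical counts (the numbers below are illustrative assumptions, not survey data):

```python
# Illustrative arithmetic: unique lab configurations grow as the product of
# independent choices. All counts below are hypothetical assumptions.
sequencers = 8          # instrument models across vendors
assays = 25             # sequencing assays and applications
prep_kits = 15          # sample preparation kits and protocols
liquid_handlers = 5     # automation robots

configurations = sequencers * assays * prep_kits * liquid_handlers
print(configurations)   # 15000 -- past 10,000 with quite modest counts
```

Even small increases in any one category multiply through the whole product, which is why a tracking system hard-wired to one configuration dates so quickly.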

Coordinating and tracking the activities, samples and data throughout such complex and potentially high-volume, automated processes may well be beyond the capabilities of traditional LIMS. IT staff who have tried to shoehorn NGS data and workflows into small-molecule-based, low-throughput LIMS become frustrated after spending months writing custom connectors to stitch together the various components. They typically end up with an inflexible, hard-to-maintain system that cannot easily be changed when a new NGS instrument or liquid handling robot comes online, or when a different type of bio-entity or data needs to be captured and indexed.
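The flexibility gap described above comes down to whether the workflow is data or code. A minimal sketch of the data-driven approach, with hypothetical class and step names chosen purely for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hedged sketch: a sample accumulates timestamped events as it moves through
# a workflow defined as data. Classes and step names are illustrative only.

@dataclass
class Event:
    step: str
    instrument: str
    timestamp: str

@dataclass
class Sample:
    sample_id: str
    events: list = field(default_factory=list)

    def record(self, step: str, instrument: str) -> None:
        """Append an audit-trail event for this sample."""
        self.events.append(
            Event(step, instrument, datetime.now(timezone.utc).isoformat()))

# The workflow is configuration, not code: adding a new instrument or step
# means editing this list rather than writing a custom connector.
WORKFLOW = ["library_prep", "quantification", "sequencing"]

s = Sample("NGS-0001")
s.record("library_prep", "liquid handler A")
s.record("sequencing", "sequencer B")
completed = [e.step for e in s.events]
print(completed)  # ['library_prep', 'sequencing']
```

In a system built this way, bringing a new instrument online is a configuration change; in a hard-coded connector architecture, it is a development project.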

Informatics System Complexity

Next-generation biologics also increase informatics system complexity when compared with previous small-molecule-based tools and workflows. Instead of discrete and easy-to-track small molecule compounds, batches, lots and samples, biologics involve complex entities including sequences, linkages and ligated variants. Registering these entities as they are modified and pooled, assigning unique identifiers and tracking them are all crucial, and full lineage requires detailed and consistent information on the parents and progeny of the samples throughout. Paper-based operations and traditional LIMS, registration and inventory systems usually don't offer these capabilities, particularly when samples are created and registered in bulk operations.
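The registration and lineage requirement above can be sketched in a few lines; the ID format, entity fields and function names here are hypothetical, intended only to show what unique identification plus parent tracking buys you:

```python
import itertools

# Hedged sketch: bulk registration of biologic entities with unique IDs and
# parent lineage. ID scheme and fields are illustrative assumptions.

_counter = itertools.count(1)
REGISTRY: dict[str, dict] = {}

def register(entity_type: str, parents: tuple = ()) -> str:
    """Assign a unique ID and record parent IDs for full lineage."""
    eid = f"BIO-{next(_counter):06d}"
    REGISTRY[eid] = {"type": entity_type, "parents": list(parents)}
    return eid

def lineage(eid: str) -> list:
    """Walk parent links back through every ancestor of an entity."""
    ancestors, stack = [], list(REGISTRY[eid]["parents"])
    while stack:
        parent = stack.pop()
        ancestors.append(parent)
        stack.extend(REGISTRY[parent]["parents"])
    return ancestors

# Bulk-register two sequences, then a pooled sample derived from both:
seqs = [register("sequence") for _ in range(2)]
pool = register("pooled_sample", parents=tuple(seqs))
print(sorted(lineage(pool)))  # ['BIO-000001', 'BIO-000002']
```

Because every entity carries its parent IDs from the moment of registration, lineage can be reconstructed for any progeny, even when thousands of samples are registered in a single bulk operation.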

And this brings us to the first of Elliott's disruptive V factors: Volume. We interviewed the Senior Project Manager of R&D IT at a large biopharma company, who admitted, "The sheer volume of (NGS) information and the need to run reliably at scale requires storage, network and compute capacity that IT groups are not prepared for – it's like dropping a bomb on them".[2] Legacy LIMS are not flexible and extensible enough to handle the volume and complexity of the new data, and attempts to adapt older systems to support NGS activities are unlikely to succeed.

As we reported in a previous white paper, Next Generation Sequencing Comes of Age, successful NGS implementations share three key capabilities: they run processes as standardized operations; they track and measure outcomes; and they manage change and communicate value. Paper-based and legacy LIMS may offer limited aspects of these critical success factors, but a forward-looking, robust, high-throughput automated environment will need more than this. As the Managing Director of a genetics testing lab confirmed, "Ultimately, technology exists to support the people doing the work for patients. When you build a good system with quality management, electronic document control, and LIMS to track and trace everything you do, you get to the end game where everything proceeds efficiently, with minimal touches, and the team can focus on delivering the results that will change the way medicine is practiced".[3]

Thermo Fisher™ Platform for Science™ software, with its flexible, configurable technology and applications, provides all these capabilities to a range of customers so that they can manage the new science that will give them a competitive edge and help them deliver novel therapies faster.

Learn more.


[1] Michael H. Elliott, CEO, Atrium Research & Consulting LLC, European Pharmaceutical Review Informatics in-depth focus, 3 September 2015

[2] Core NGS White Paper

[3] Core NGS White Paper