In a recent blog, we identified five major challenges facing R&D and IT managers in pharma and biopharma companies as they strive to equip their researchers with modern, flexible and responsive informatics tools:

  • Managing New Science
  • Dealing with Change
  • Integrating Old and New Systems
  • Collaboration and Externalization
  • Harmonization and Simplification

This post is the second in a series that considers these challenges in more detail, drawing on examples from our collaborations with customers to understand their requirements and devise and deliver solutions. The first post, Informatics Challenges in Pharma and Biopharma R&D – Managing New Science, discussed managing new science; this time we will explore the informatics challenges of Dealing with Change.

As the French critic Jean-Baptiste Alphonse Karr memorably stated, “plus ça change, plus c’est la même chose” (the more things change, the more they stay the same), but we can assume that M. Karr didn’t have to manage the informatics infrastructure in a fast-paced biopharma lab, where change is a fact of life.

Science, analytical techniques and experiment design are evolving rapidly. R&D organizations are moving from small molecules to biotherapeutics, and are keen to handle the new types of data generated at volume by automated workflows and modern techniques such as next-generation sequencing (NGS). Michael Elliott of Atrium Research & Consulting LLC highlights the challenge: “The rapid scientific advancement in areas such as next-generation sequencing, precision medicine and next-generation biologics increases the volume, variety and the complexity of datasets. IT teams are having a difficult time keeping up with these changes”.[i]

As we discussed previously, establishing enhanced informatics systems to effectively manage new science is a widespread and pressing need. But even when a new system is successfully implemented and deployed to researchers and technicians, things won’t stand still. Legacy systems may need to be maintained and the new systems will evolve. And although he also didn’t manage any IT infrastructure, the philosopher Heraclitus was correct in observing that “there is nothing permanent except change”.

Even the best-run lab will need to make changes as it extends workflows, generates different types of data and deals with new staff and collaborators. Examples include:

  • Modern workflows such as NGS involve multiple sequencing technologies, assays, analytical instruments and automation robots, and span several departments and user types. Any of these is susceptible to change, e.g. as a reagent kit is upgraded, a new liquid handler comes on-stream or a new data-dependent trigger is added to a workflow.
  • In animal studies, a standard set of in vivo bioassays measuring e.g. dose, pharmacokinetics and pharmacodynamics may need to be extended to capture new behavioral observations of the subjects, spread over a long time-course, and with new types of non-structured data. These in turn will need to be included in the overall data model in a consistent way so that all users of the new data know how to access and use it correctly.
  • In the increasingly common extended virtual research environments of today’s biopharma industry, the client company will want to add new collaborators and CROs quickly to provide specific skills and resources, and in such a way that each CRO can access only the data it needs to see, while the sponsor company retains secure access to the whole corpus of data in the appropriate format and with the right tools at hand.
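
The scoped-access scenario in the last bullet can be expressed as a small configuration change rather than custom code. The following is a minimal, hypothetical sketch (it does not represent any actual product API): a policy object maps each organization to the set of projects it may read, with the sponsor granted unrestricted access, so onboarding a new CRO is a single configuration call.

```python
# Hypothetical sketch of configuration-driven data scoping for external
# collaborators. The class and names are illustrative only, not a real API.
from dataclasses import dataclass, field
from typing import Optional, Set, Dict

@dataclass
class AccessPolicy:
    # Maps organization -> readable project names; None means unrestricted
    # (i.e. the sponsor company sees the whole corpus).
    scopes: Dict[str, Optional[Set[str]]] = field(default_factory=dict)

    def grant(self, org: str, projects: Optional[Set[str]]) -> None:
        """Add or update an organization's scope in one configuration step."""
        self.scopes[org] = projects

    def can_read(self, org: str, project: str) -> bool:
        """True if the organization may read data from the given project."""
        if org not in self.scopes:
            return False                  # unknown organization: no access
        allowed = self.scopes[org]
        if allowed is None:
            return True                   # sponsor: full access
        return project in allowed         # CRO: scoped access only

policy = AccessPolicy()
policy.grant("SponsorPharma", None)                # full access
policy.grant("CRO-Bioanalysis", {"NGS-Oncology"})  # sees one project only
```

With this shape, onboarding a second CRO takes hours, not weeks: one more `grant` call, with no change to application code.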

In all these cases, the required change (whether to a workflow, a data type or a user definition and its permissions) should take hours rather than weeks or months. Rather than hoping for increased priority in the pile of change orders submitted to IT, or waiting on an external vendor to take a look at the request, organizations need to be able to respond quickly to changed circumstances so that productivity and efficiency are maintained. This requires a readily adaptable informatics infrastructure, such as Thermo Fisher™ Platform for Science™ software, that can be amended and extended through simple configuration changes rather than custom coding.

Platform for Science software, with its flexible, configurable technology and applications, is helping a range of customers address this constant need for change, so that they can rapidly adapt in ways that give them a competitive edge and help them deliver novel therapies faster. Learn how.

[i] Michael H. Elliott, Atrium Research & Consulting LLC, European Pharmaceutical Review Informatics in-depth focus, 3 September 2015