In the blog post, Informatics Challenges in Pharma and Biopharma R&D, we identified five major challenges facing R&D and IT managers in pharma and biopharma companies as they strive to equip their researchers with modern, flexible and responsive informatics tools.

This post is the fifth in a series examining these challenges in more detail, drawing on insights gained as our staff collaborated with customers to understand their requirements and devise and deliver solutions. The first four posts discussed Managing New Science, Dealing with Change, Integrating Old and New Systems, and Collaboration and Externalization; here we turn to the informatics challenges of Harmonization and Simplification.

Opportunities to Reduce Complexity in Laboratory Data Management

Methods and techniques in pharma and biopharma R&D continue to evolve, and informatics systems struggle to keep pace. As we discussed in previous blog posts, IT staff and R&D managers have to overcome many challenges while keeping scientists and technicians working efficiently, often with reduced budgets and constrained resources.

Simply addressing each of these challenges individually, without considering adjacent applications and workflows and any underlying data model(s), can result in inefficient, disjointed processes and non-communicating silos of inconsistent data. As Jerry Karabelas, one-time head of Novartis R&D, observed when looking at another company’s vast databases of information, “Data, data everywhere, and not a drug, I think.”[i] The problem is echoed by Michael Shanler of Gartner: “Scientific innovation is being crushed under the weight of so many disparate and non-communicative systems, and the costs of maintaining yesterday’s status quo for laboratory informatics are dangerous. The status quo is still one of complication, redundancy and complexity. It is unsustainable.”[ii]

But as the noted writer on information design, Edward Tufte, points out, “Clutter and confusion are failures of design, not attributes of information,”[iii] and one solution to these failures is suggested by none other than Albert Einstein with his three rules of work: “Out of clutter find simplicity; From discord find harmony; In the middle of difficulty lies opportunity.”[iv]

Harmonizing Laboratory Data with LIMS Systems

Applying Einstein’s three rules to informatics systems in order to achieve harmonization and simplification can indeed create opportunities: more efficient R&D workflows across multiple disciplines, and better access to data and information, which fosters more timely decision-making and better science. And when simplification allows now-redundant legacy applications to be sunset, the result is reduced system maintenance effort, streamlined system upgrades and a lower total cost of ownership.

But what is required to achieve the desired harmonization and simplification, especially when labs need to stay operational and older systems can’t simply be dropped overnight? A central requirement is a modern, extensible LIMS and informatics platform with a data model that can handle the current, and likely disparate, data types and volumes it will encounter. To allow effective co-existence, interaction and data exchange with systems that remain in service, the platform must also provide an Application Programming Interface (API).
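To make the idea of API-mediated data exchange concrete, here is a minimal sketch of the kind of translation layer such an API enables: a record exported from a legacy system is mapped into a harmonized master data model, with unmapped fields preserved rather than discarded. All field names and structures below are illustrative assumptions, not actual Core LIMS or Platform for Science identifiers.

```python
import json

# Hypothetical mapping from a legacy system's export fields to a
# harmonized master data model; names are illustrative only.
LEGACY_TO_MASTER = {
    "smp_id": "sample_id",
    "conc_mgml": "concentration_mg_per_ml",
    "run_ts": "acquired_at",
}

def harmonize(legacy_record: dict) -> dict:
    """Translate a legacy export into the master data model, keeping
    unmapped fields under an 'extensions' key so no data is lost."""
    master, extensions = {}, {}
    for key, value in legacy_record.items():
        if key in LEGACY_TO_MASTER:
            master[LEGACY_TO_MASTER[key]] = value
        else:
            extensions[key] = value
    if extensions:
        master["extensions"] = extensions
    return master

legacy = {"smp_id": "S-1042", "conc_mgml": 2.5, "operator": "jdoe"}
print(json.dumps(harmonize(legacy), sort_keys=True))
```

The point of the sketch is that the mapping lives in data, so adding another legacy source means extending a table, not writing another point-to-point connector.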

With the API in place, trained in-house developers can link activities in extended, multi-disciplinary workflows; monitor, share and exchange data; and include external data in a consistent yet extensible master data model – all through configuration, rather than by writing hard-to-maintain, custom-coded, point-to-point connectors. As workflows change, new analytical and scientific techniques come online and different data types need to be captured, Thermo Scientific™ Core LIMS™ software can be easily and quickly extended by IT staff via simple configuration changes.
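The principle of extension by configuration rather than code can be sketched as follows: entity types and their attributes are declared as data, so a new assay type can be registered at runtime without modifying the validation logic. These structures are a hypothetical illustration of the pattern, not the actual Core LIMS configuration format.

```python
# Entity types and attribute schemas declared as configuration data;
# all names here are illustrative assumptions.
ENTITY_TYPES = {
    "Sample": {"attributes": {"sample_id": str, "volume_ul": float}},
}

def register_entity_type(name, attributes):
    """Extend the data model at runtime via configuration alone."""
    ENTITY_TYPES[name] = {"attributes": dict(attributes)}

def create_entity(type_name, **values):
    """Validate a record against the configured attribute schema."""
    schema = ENTITY_TYPES[type_name]["attributes"]
    for key, value in values.items():
        if key not in schema:
            raise KeyError(f"Unknown attribute {key!r} for {type_name}")
        if not isinstance(value, schema[key]):
            raise TypeError(f"{key} must be {schema[key].__name__}")
    return {"type": type_name, **values}

# A new technique's data type is added without touching create_entity:
register_entity_type("NGSRun", {"run_id": str, "read_count": int})
run = create_entity("NGSRun", run_id="R-77", read_count=1_200_000)
```

Because validation reads the schema from configuration, capturing a new data type is a declarative change, which is the property that keeps downtime and custom code to a minimum.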

This means that LIMS downtime is minimized, the users’ learning curve is barely perturbed, and researchers can take advantage of extended and potentially more complex data types that are now stored consistently and available for immediate search, visualization and analysis. And as the data and features from legacy applications are subsumed into the new informatics platform, the older systems can be sunset, freeing up valuable IT resources that can be better deployed on more value-added activities.

Thermo Fisher™ Platform for Science™ software provides all these capabilities from a single, modern technology stack that can be deployed in-house or in the cloud (public or private) and that is accessible with any web browser (including on mobile devices) for real-time data capture and sharing to speed research and innovation. With scalability and configurability built in, Platform for Science software can handle the increasingly large data sets generated by modern research techniques, and can manage not just the data but all the entity attributes, physical samples, material identifiers, complex workflows, requests and reports involved in today’s multi-disciplinary, multi-site R&D endeavors.

Customers are already reaping the benefits of using Platform for Science software to support their continually evolving informatics roadmaps, achieving simplification and harmonization in their informatics tools, data structures and processes, which delivers greater efficiency, better science and reduced maintenance costs. Learn how.


[i] Quoted in The Antidote: Inside the World of New Pharma, by Barry Werth, Simon and Schuster, Feb 4, 2014

[ii] Michael Shanler, Gartner, Inc., Research Note G00251083, 26 September 2013

[iii] Envisioning Information, by Edward R. Tufte, Graphics Press, 2003

[iv] Ideas and Opinions, by Albert Einstein, Bonanza Books, 1988 (reprint)