
Building the Lab of the Future

Guru Singh, December 20, 2019

Pharmaceutical companies are constantly grappling with huge datasets. Technologies are rapidly advancing, allowing researchers to collect vast amounts of preclinical and clinical data during product development. The problem now lies in how to best manage that data to unlock its true value. In the Lab of the Future, cloud technologies could provide an ideal solution.

It was clear from speaking with biopharma leaders that we, as an industry, have several hurdles to overcome before we can realize the vision of a seamless, integrated, automated research lab. To start with, lab digitization tools must continue to become more user-friendly. For example, mobile voice-powered digital lab assistants, designed to be simple and easy to use, are starting to replace static electronic lab notebooks.
Artificial Intelligence (AI) has the potential to transform research labs, but it must work in synergy with human scientists, so that researchers welcome digital tools into the lab rather than feeling threatened by them.
Furthermore, the data avalanche that life science companies are experiencing will only grow as AI, machine learning and hardware tools evolve. Data management is therefore a growing concern for every research laboratory.

Life Science Lab Digitization 

Martin Clausen, Technology Manager for Assay Solutions at Bayer Pharmaceuticals, shared how end-to-end digitization of preclinical research labs can positively impact research efficiency. He started by explaining that preclinical data assets are generally stored in silos. This is because, unlike clinical data, preclinical data is created by interdisciplinary teams and comes in diverse formats. Preclinical teams therefore face challenges in correctly annotating and harmonizing this data.
Clausen reported that lab digitization can improve data management and thereby increase the chances of advancing the right molecules into clinical testing. He noted that the right combination of commercial and custom software, along with appropriate enterprise data infrastructure, can reduce the time and effort required for complex digitization projects without compromising objectives.

AI in the Lab 

During the congress, LabTwin joined leaders from Amgen, Actelion, Bayer Pharmaceuticals and Merck to discuss how AI will impact pharma R&D. Participants agreed that AI could improve productivity at every stage of the R&D process. However, for AI to be a valuable research tool, subject matter experts and data scientists must work together to train algorithms and label data. Even more importantly, the lab informatics industry must work to build scientists’ trust in AI solutions. Pharma leaders believe that robust regulatory guidelines for AI will help build this trust. Once scientists, managers and executives can see the value of these tools, they are more likely to integrate AI into their Smart Lab of the Future.

Data Management 

Life science technologies have rapidly advanced in the past 20 years. It is now cheaper and easier than ever before to create enormous data sets. However, this brings a number of challenges. 
Kees van Bochove, Founder of The Hyve, shared his perspective on implementing FAIR data principles in biopharma R&D. FAIR stands for Findable, Accessible, Interoperable and Reusable.
By following FAIR principles and harmonizing data, companies can break down data silos, facilitate collaboration and make better use of their valuable data. However, this is not easy for biopharma companies with decades’ worth of diverse datasets. Van Bochove believes the key to achieving FAIR data is to set up the right enterprise-wide data management infrastructure.


The Smart Lab of the Future is closer than ever. To achieve this vision, lab informatics companies need to work closely with scientists and biopharma executives to overcome adoption and data management challenges.


