Lab of the Future is Now

Team Talk

How to Ensure Rich and Standardized Documentation?

Célia Gasselin, January 5, 2022

In this era of data-driven decisions, harnessing predictive machine learning models is the ultimate goal of most organizations. To build such models, high-quality data and metadata must be captured at the point of experimentation. The challenge is to enable seamless data capture for scientists while ensuring as much standardization as possible to facilitate data integration.

We invited R&D IT experts from GSK, PPD, and Eisai to discuss their current strategies for shifting documentation to a standardized and seamless process that meets data requirements without burdening scientists working at the bench.

 Listen to the whole discussion here.

Finding the Balance between Flexibility and Standardization

Paper has long been the favorite documentation tool of scientists because of its flexibility. However, digital transformation is pushing scientists to adopt lab informatics platforms such as Electronic Lab Notebooks (ELNs) and to standardize their data entry.

Unstructured ELNs

The first step towards digitalization is to implement a paper-on-glass solution, letting users enter their documentation in free-text sections of an ELN. However, this data remains largely unstructured and is therefore challenging to integrate across experiments, time periods or sites. Unstructured, unlabeled data is also difficult to access during audits or subsequent studies, and there is a higher risk of it being ‘lost’, leaving companies unable to unlock the full value of their data stores.

 

“There is a difference between electronic data and digital data. We need to make sure it's digital in order for us to use it beyond what you could with a paper entry.”

Ryan Ellefson, Associate Director, Lab Informatics at PPD

 

Structured ELNs

To enable downstream data reuse, ELN templates can structure data collection and ensure that every important parameter is captured. Some flexibility can be maintained by allowing scientists to add or remove capture fields on the fly and to adapt pre-scripted templates to their lab workflows.

 

“Structured capture of useable information makes record keeping easier and also allows insight at the forest level without having to go into the individual experiments and look at every single tree.”

Jennifer Heymont, Associate Director, Scientific Informatics at Eisai Pharma
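
To make the idea of a flexible template concrete, here is a minimal sketch of how a structured ELN entry could combine a fixed set of required fields with optional fields added on the fly. The dictionary-based representation and the field names are illustrative assumptions, not the schema of any particular ELN product.

```python
# Illustrative sketch of a structured ELN entry: required fields are fixed,
# while scientists can add optional fields without breaking the core structure.
# Field names ("cell_line", "passage_number", ...) are hypothetical examples.

REQUIRED_FIELDS = {"experiment_id", "scientist", "date", "protocol"}

def new_entry(template_fields: dict, **extra_fields) -> dict:
    """Build an ELN entry, enforcing required fields and allowing ad hoc ones."""
    entry = {**template_fields, **extra_fields}
    missing = REQUIRED_FIELDS - entry.keys()
    if missing:
        raise ValueError(f"Missing required fields: {sorted(missing)}")
    return entry

entry = new_entry(
    {"experiment_id": "EXP-0042", "scientist": "J. Doe",
     "date": "2022-01-05", "protocol": "cell_viability_v3"},
    cell_line="HEK293",          # optional field added on the fly
    passage_number=12,
)
print(entry)
```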

 

Standardized vocabulary

Standardized definitions and vocabulary are essential to being able to integrate and compare data generated at different times or across different sites. Using pre-scripted ELN templates can help standardize vocabulary across an organization.

 

“In our old LIMS system, we had 16 different ways of writing the unit of measure percent weight/weight. 16! Now we have an enterprise reference data management system that will hold all our standardized vocabulary centrally and then any system can call on it.”

Penny Smee, Senior Product Owner at GSK

 

Furthermore, to ensure higher data integrity, informatics systems can automatically check newly captured data for consistency with existing records, preventing errors and ensuring data are linked from the point of entry.
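
As a hedged illustration of the "percent weight/weight" problem described above, the sketch below normalizes free-text unit strings against a small central vocabulary and rejects anything it cannot map, a simple form of consistency checking at the point of entry. The synonym table and function name are invented for this example; in practice, an enterprise reference data management system would serve the vocabulary as a shared service that any system can call on.

```python
# Toy reference vocabulary: many free-text spellings map to one canonical unit.
# The synonyms below are illustrative, not an exhaustive or official list.
UNIT_SYNONYMS = {
    "% w/w": "percent_w_w", "%w/w": "percent_w_w", "wt%": "percent_w_w",
    "percent weight/weight": "percent_w_w", "pct w/w": "percent_w_w",
    "mg/ml": "mg_per_ml", "mg/mL": "mg_per_ml",
}

def normalize_unit(raw: str) -> str:
    """Map a free-text unit to its canonical form, or fail loudly."""
    key = raw.strip()
    canonical = UNIT_SYNONYMS.get(key) or UNIT_SYNONYMS.get(key.lower())
    if canonical is None:
        raise ValueError(f"Unknown unit '{raw}': not in the reference vocabulary")
    return canonical

print(normalize_unit("wt%"))      # -> percent_w_w
print(normalize_unit("mg/mL"))    # -> mg_per_ml
```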

Mobile assistants

Accessing a computer to collect data during an experiment will always be a challenge. Mobile technology is the obvious solution. Mobile assistants can move with scientists around the lab and collect data while scientists focus on their experiments.

 

“Our vivarium has been first to adopt mobile. They're all using iPads to do their daily logging.”

Jennifer Heymont, Associate Director, Scientific Informatics at Eisai Pharma

 

Voice-powered digital lab assistants offer a flexible, natural interface for hands-free data entry. They also leverage natural language processing to recognize entities and automatically structure the captured data within templates.

 

“At GSK, we're certainly looking to drive that forward next year and really supplement the ELNs we already have.”

Penny Smee, Senior Product Owner at GSK
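
The snippet below is a deliberately simplified sketch of how dictated text can be turned into structured fields: it pulls an amount, a unit and a reagent out of a sentence with a regular expression and drops them into a template. Real voice-powered assistants rely on trained language models rather than hand-written patterns; the pattern and field names here are assumptions made purely for illustration.

```python
import re

# Very rough stand-in for entity recognition on a dictated sentence.
# The regex and fields are illustrative; production systems use trained NLP models.
PATTERN = re.compile(r"add (?P<amount>[\d.]+) (?P<unit>\w+) of (?P<reagent>[\w\s-]+)", re.I)

def structure_utterance(utterance: str) -> dict:
    match = PATTERN.search(utterance)
    if not match:
        return {"raw_text": utterance}          # keep unstructured text as a fallback
    return {
        "action": "add_reagent",
        "amount": float(match.group("amount")),
        "unit": match.group("unit"),
        "reagent": match.group("reagent").strip(),
    }

print(structure_utterance("Add 25 microliters of Tris-HCl buffer"))
# {'action': 'add_reagent', 'amount': 25.0, 'unit': 'microliters', 'reagent': 'Tris-HCl buffer'}
```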

 

Bypassing Manual Data Entry

The need to capture ever-increasing amounts of data can place a real burden on scientists. One way to alleviate this burden is to automate parts of the data and metadata capture process.

Barcodes

Barcodes are the most popular way to automatically collect metadata about the samples, reagents or equipment used and attach it to the corresponding experiments. However, barcodes must still be scanned manually by the scientist to be associated with the current experiment in the system of record.
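
A minimal sketch of that association step, assuming a hypothetical in-memory sample registry keyed by barcode: the scanner emits an ID string, and the system of record looks it up and attaches the stored metadata to the open experiment. A real LIMS would back this with a database and an audit trail.

```python
# Hypothetical registry keyed by barcode; entries are invented for illustration.
SAMPLE_REGISTRY = {
    "BC-1001": {"type": "sample", "name": "Plasma batch 7", "lot": "L-2211"},
    "BC-2044": {"type": "reagent", "name": "PBS buffer", "lot": "R-0934"},
}

def attach_scan(experiment: dict, barcode: str) -> None:
    """Attach the metadata behind a scanned barcode to the current experiment."""
    record = SAMPLE_REGISTRY.get(barcode)
    if record is None:
        raise KeyError(f"Barcode {barcode} not found in the registry")
    experiment.setdefault("linked_items", []).append({"barcode": barcode, **record})

experiment = {"experiment_id": "EXP-0042"}
attach_scan(experiment, "BC-1001")
attach_scan(experiment, "BC-2044")
print(experiment["linked_items"])
```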

Radio-frequency identification

With radio-frequency identification (RFID), companies can automatically track samples and reagents throughout their lifecycle without manual data entry. RFID also allows companies to keep an active inventory that is always up to date with the latest location of each item. In addition, RFID tags can interact with equipment, such as balances, to enrich metadata and facilitate IoT interactions. Exploring the potential of RFID is therefore on the roadmap of several organizations.

Internet of things

In the process of making the lab more connected, companies are setting up networks of connected laboratory devices, the Internet of Things (IoT), to automatically capture data and metadata.

 

“Looking ahead to the Internet of Things, one of the huge benefits of that path is that it gives you the ability to get metadata without asking scientists to take on the overhead of recording it.”

Jennifer Heymont, Associate Director, Scientific Informatics at Eisai Pharma
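
One way to picture automatic IoT capture, without committing to any particular protocol or vendor: each instrument reading arrives as a small message, and a gateway enriches it with the metadata the scientist would otherwise have to record by hand (instrument ID, timestamp, linked experiment). The message format and field names below are assumptions for illustration only.

```python
import json
from datetime import datetime, timezone

def ingest_reading(message: str, experiment_id: str) -> dict:
    """Enrich a raw instrument message with metadata the scientist never has to type."""
    reading = json.loads(message)                     # e.g. from a connected balance
    return {
        **reading,
        "experiment_id": experiment_id,               # context added automatically
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "source": "iot_gateway",
    }

# Simulated payload from a networked balance (format is illustrative only).
payload = json.dumps({"instrument_id": "BAL-03", "value": 12.0731, "unit": "g"})
print(ingest_reading(payload, experiment_id="EXP-0042"))
```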

 

Smart assistants

By constantly following scientists through their experiments, smart digital lab assistants can automatically interlink the captured data and enrich it with contextual metadata. They can also structure this data in a standardized format and ensure its completeness. Complete, standardized datasets enriched with metadata improve quality control, validation and troubleshooting.

“LabTwin’s digital lab assistant enables seamless hands-free data capture at the bench through voice recognition and automatically structures the data to facilitate integration.”

Célia Gasselin, Marketing Communication Manager at LabTwin

 

Accessing Data

One powerful benefit of standardized data is that it becomes usable for querying, sharing and building data models.

Dashboards and prediction models

Standardized data stores can automatically integrate raw data and metadata. Dashboards can then be generated so scientists and managers can easily access information, make data-driven decisions and optimize processes.

 

“GSK’s mindset is we want to be able to predict fast. We want to stop scientists doing 60 experiments and instead use data to say this is the right first experiment, not iteratively get to that understanding.”

Penny Smee, Senior Product Owner at GSK
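
Dashboards of this kind become straightforward once records share a schema and vocabulary. The sketch below aggregates a handful of made-up, already-standardized records by site using only the Python standard library; a real deployment would more likely combine a data warehouse with a business intelligence tool.

```python
from collections import defaultdict
from statistics import mean

# Made-up, already-standardized records: same field names and units throughout.
records = [
    {"site": "Boston", "assay": "viability", "yield_pct": 82.1},
    {"site": "Boston", "assay": "viability", "yield_pct": 79.4},
    {"site": "Munich", "assay": "viability", "yield_pct": 88.0},
]

# Group by site and compute a simple dashboard metric.
by_site = defaultdict(list)
for rec in records:
    by_site[rec["site"]].append(rec["yield_pct"])

for site, yields in sorted(by_site.items()):
    print(f"{site}: mean yield {mean(yields):.1f}% over {len(yields)} runs")
```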


Visibility for audit

Resurfacing the original data during an audit is a challenge that companies must be able to meet. Standardized, integrated and labelled data remains easily accessible and visible for audits or any other future use, allowing companies to extract maximum value from their data.

 

“Whenever we have requests from Intellectual Property, we want to be able to follow up on a compound and not just go to the screening data, but back to the original experiment, in a meeting with ease.”

Jennifer Heymont, Associate Director, Scientific Informatics at Eisai Pharma

 

Sharing with external partners

Research is often a collaborative process between different partners. However, a lack of common data standards can make it challenging to share data between databases.

 

“The thing that needs to change as a CRO industry is how we do data delivery.”

Ryan Ellefson, Associate Director, Lab Informatics at PPD

 

A standardized vocabulary and consistent data structuring help partners share data, and machine learning tools can also help harmonize disparate datasets.
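
As a toy illustration of harmonization, the sketch below maps a partner's field names onto an internal standard using simple string similarity rather than full machine learning; the field names and the similarity threshold are invented for the example.

```python
from difflib import get_close_matches
from typing import Optional

# Internal standard field names and a partner's export headers (both illustrative).
STANDARD_FIELDS = ["sample_id", "concentration_mg_per_ml", "measurement_date"]
partner_fields = ["SampleID", "Conc (mg/ml)", "date_of_measurement"]

def harmonize(field: str) -> Optional[str]:
    """Suggest the closest internal field name, or None if nothing is close enough."""
    simplified = (field.lower().replace(" ", "_").replace("(", "")
                  .replace(")", "").replace("/", "_per_"))
    matches = get_close_matches(simplified, STANDARD_FIELDS, n=1, cutoff=0.5)
    return matches[0] if matches else None

for field in partner_fields:
    print(f"{field!r} -> {harmonize(field)!r}")
```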

 

Turning Scientists into Data Champions

To collect standardized data from the point of capture, companies must bring scientists on board, raising awareness of the impact of good and bad data and building an understanding of how data is used downstream. This helps scientists appreciate the value of their data and align their processes with data requirements.

“By having our teams work together in a more integrated fashion, the biologists can see how the chemists are using the data and understand how their choices around data quality and data presentation affect the ability of the chemists to use that data to make decisions.”

Jennifer Heymont, Associate Director, Scientific Informatics at Eisai Pharma

Conclusion: Working Towards Virtual Workflows

Connecting laboratory instruments and informatics systems into one ecosystem will help companies create virtual workflows. Smart digital lab assistants can connect with both instruments and informatics systems to provide a single, flexible, user-friendly interface from which scientists can control their experiments.

 

“For me, the future of data collection will come when we remove the world of ELN and LIMS and we dream up of a more digitized workflow-oriented system that allows you to float those workflows in and out in a very flexible manner.”

Penny Smee, Senior Product Owner at GSK

 

 

 Listen to the whole discussion here.

 
