Challenge 4: Improve standardization of procedures and data

June 17, 2024

 

This blog is part of the blog series: Top 7 Challenges for R&D Labs Operations and Digitalization: Insights from our Clients.

 

The Power of Standardization

In today's data-driven world, standardization has become a critical cornerstone for organizations seeking to enhance operational efficiency, ensure data integrity, and unlock the transformative power of artificial intelligence (AI). Comparing and cross-analyzing data across different sites, departments, or even industries requires a consistent and unified approach to procedures and data management.

The ability to leverage data effectively is directly proportional to the quality and standardization of the information at hand. AI/ML systems, which thrive on large volumes of high-quality data, can only deliver their full potential when fed with standardized and consistent inputs. Recent efforts have been made to prioritize standardization across various organizational levels, recognizing its pivotal role in maximizing the value of data assets.


Scientists have traditionally been accustomed to using field-specific or personal nomenclatures and structures for data collection. However, this practice hinders collaboration and reduces the accessibility of data for further analysis. To address this challenge, an increasing number of organizations are striving to implement FAIR (Findable, Accessible, Interoperable, and Reusable) data capture principles.

“At the moment, our different sites do the protocols in a slightly different way. In one site, scientists will measure in specific units, and in the other site, they would use another unit for a specific protocol. Users don't always note down the units they are using, and then the data cannot be compared between the different sites,” declares a Senior Scientist from the Food & Beverage industry.
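The unit mismatch described in this quote can be made concrete with a minimal sketch: if every measurement is normalized to a canonical unit at capture time, results from different sites become directly comparable. The units and conversion factors below are illustrative assumptions, not a description of any particular site's protocol.

```python
# Minimal sketch: normalize measurements to a canonical unit (mg/L)
# at capture time so data from different sites can be compared.
# The units and factors here are illustrative assumptions.

# Conversion factors from each recorded unit to milligrams per litre.
TO_MG_PER_L = {
    "mg/L": 1.0,
    "g/L": 1000.0,
    "ug/mL": 1.0,  # 1 ug/mL equals 1 mg/L
}

def normalize(value: float, unit: str) -> float:
    """Convert a measurement to the canonical unit (mg/L)."""
    try:
        return value * TO_MG_PER_L[unit]
    except KeyError:
        # An unrecorded or unknown unit is exactly the failure mode
        # the scientist describes: the value cannot be compared.
        raise ValueError(f"Unknown unit {unit!r}: record units explicitly")

# Site A logs grams per litre, Site B logs milligrams per litre:
site_a = normalize(0.5, "g/L")   # 500.0 mg/L
site_b = normalize(500, "mg/L")  # 500.0 mg/L
assert site_a == site_b
```

A shared lookup table like this is the simplest possible standardization layer; in practice a dedicated units library would handle dimensional analysis more robustly.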

The primary strategy employed to enforce standardized data capture is the rollout of Electronic Lab Notebooks (ELNs) and Laboratory Information Management Systems (LIMS). These platforms host protocols and provide templates with drop-down menus that constrain entries to standard nomenclatures. This approach has proven effective in standardizing data across organizations. However, the challenge lies in the time-consuming and often cumbersome process of using these systems during active experiments.

Scientists frequently need to pause their experiments to log in, navigate through various fields, and select the desired answer from drop-down menus. Alternatively, they may need to fill in the information later from memory, which can introduce errors and inconsistencies (read our dedicated white paper: Why ELNs Need a Helping Hand… or Voice?).

Moreover, ELN and LIMS templates present an additional hurdle for Research and Development (R&D) labs, which frequently develop new methods and need to capture unexpected observations.

“In R&D the primary goal is to design something new and different, but you need standard templates, so that’s the common friction point. It is hard to fit both the FMCG categories Ice Cream and Shampoo in the same ELN template; therefore, we always customize some parts,” explains an R&D Digital Transformation Leader in the FMCG industry (learn from the panel discussion with FMCG industry leaders).

While standardizing data is crucial for collaboration, analysis, and leveraging the potential of artificial intelligence (AI), the current strategies face usability challenges, particularly during active experiments and in R&D environments where flexibility is essential. Addressing these usability concerns is key to ensuring widespread adoption and effective implementation of standardized data capture practices.

 

Isn't There a Better Way in the Age of AI?

Ontologies and knowledge graphs are being developed to link different terminologies and entities together and bring a standard structure to the data. Combined with the power of AI, these modern technologies are being explored to clean up and leverage archived unstructured data and extract insights from past data.

However, little effort is dedicated today to making data capture smarter at the source, structuring and standardizing it without burdening scientists with excessive data wrangling or reducing the richness of the captured data.

By developing a voice-powered digital lab assistant to support scientists at the bench with hands-free data capture and guidance, LabTwin is uniquely positioned to provide comprehensive support in data processing by leveraging the AI component embedded within its product (watch our recorded webinar Next-Gen Data Capture: Getting FAIR Data without Impairing Scientist’s Efficiency).

When a scientist is capturing data, the LabTwin AI can recognize different entities (reagents, instruments, observations, …) and label the data accordingly, enabling automatic organization into structured reports or seamless integration with pre-made templates.
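The kind of entity labelling described above can be sketched with simple dictionary lookups. LabTwin's actual model is proprietary, so the vocabularies, the `label_entities` helper, and the sample transcript below are purely illustrative of the general technique.

```python
# Illustrative sketch of rule-based entity labelling for a dictated note,
# standing in for the model described in the text. The vocabularies and
# the transcript are invented for this example.
import re

# Tiny example vocabularies mapping entity labels to known terms.
VOCAB = {
    "reagent": {"acetonitrile", "methanol"},
    "instrument": {"HPLC", "centrifuge"},
}

def label_entities(transcript: str) -> dict:
    """Return a structured record: {label: [matched terms in order]}."""
    report = {label: [] for label in VOCAB}
    for token in re.findall(r"[\w/]+", transcript):
        for label, terms in VOCAB.items():
            if token in terms or token.lower() in terms:
                report[label].append(token)
    return report

note = "Injected 10 uL of acetonitrile extract into the HPLC"
print(label_entities(note))
# {'reagent': ['acetonitrile'], 'instrument': ['HPLC']}
```

Once entities carry labels like these, routing each value into the matching field of a report template or ELN form becomes a straightforward mapping step.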


Moreover, as LabTwin can engage in two-way communication, it can ask for confirmation when unsure about the best match or notify the user about missing information in specific fields, ensuring completeness and accuracy (have a look at our dedicated LabTwin AI Webpage).

 

[Animation: Report Entity Recognition]

LabTwin's embedded AI can automatically recognize entities such as instruments in the captured data, as well as label different types of information (observations, ideas) and organize them into a comprehensive structured report. This approach helps managers and collaborators quickly access and comprehend the information while saving significant time for the scientist.

 

One of AI's primary roles is to simplify the interaction between humans and machines. By leveraging this technology, LabTwin's products can build a better user interface that facilitates the lives of scientists running experiments in the lab. Through seamless integration of AI-powered data capture and processing, scientists can focus on their core tasks without being overburdened by excessive data management responsibilities.

Furthermore, the structured and standardized data captured by LabTwin's AI-driven approach can be seamlessly integrated with ontologies and knowledge graphs, enabling effective linking of terminologies and entities across different domains. This synergy between AI-powered data capture and semantic technologies paves the way for advanced analytics, cross-domain collaboration, and the unlocking of valuable insights from previously siloed or unstructured data sources.

 

Standardizing Procedures: Balancing Integrity and Efficiency

Replicating an experiment is a cornerstone of scientific research because it is what makes results robust and credible. However, multiple variants of protocols often coexist (e.g., different HPLC methods across sites), or even two individuals running the same protocol may have slightly different methods. This creates time inefficiencies, either by requiring experiments to be repeated to more carefully match the original protocol or by necessitating time-consuming data corrections to enable result comparisons and facilitate collaboration.
In a Good Practice (GxP) environment, this standardization is even more essential to provide quality assurance of processes and obtain validation from audit organizations. Standardized procedures are also invaluable for efficiently onboarding and training new scientists and assessing their competency on a single, standardized method.

ELN and LIMS provide shared protocol libraries with versioning to support this standardization process. However, their accessibility at the bench remains a challenge, and scientists often resort to printing protocols on paper, which they carry around the lab. Printed protocols lose the ability to stay up to date, as they may be reused multiple times without checking for updated versions shared by collaborators. Additionally, checking the next steps on a printed protocol interrupts the experimental flow, so scientists often choose to rely on memory, jeopardizing the integrity of the procedure in the pursuit of efficiency. Unfortunately, errors can easily occur when scientists try to remember the next steps instead of carefully consulting the protocol.


[Image: Lab situation: being guided through a non-routine process]

Scientists in the lab must constantly balance integrity with efficiency, as checking a protocol on paper or in the ELN/LIMS requires interrupting their experiment.

Having a digital, dynamic, shared protocol library is the first essential step toward standardizing procedures. However, the interface is still not appropriate for scientists working in the lab while wearing gloves, conducting time-sensitive experiments, and requiring their full focus.

By using our AI- and voice-powered digital lab assistants, scientists can access the latest protocols from their lab informatics systems, convert them into voice-friendly flows, and receive step-by-step verbal guidance in the lab without removing gloves or interrupting their experiments, ensuring both integrity and efficiency.

“I saved 30 minutes per day by being guided by LabTwin for loading the reactor instead of using paper,” stated a Technical Service Chemist at a Swiss chemical company (read the case study).

Scientists can also ask questions and receive AI-powered answers about protocol details or associated safety precautions.
Moreover, data capture prompting steps or confirmation steps can be added to the protocol, especially for Standard Operating Procedures (SOPs) in the GxP environment, further ensuring that each step has been thoroughly followed and necessary data is accurately captured in the correct format.

 

[Screenshot: LabTwin protocol guidance]

Protocols can be followed hands-free, through verbal guidance with interactive steps that support data capture and procedural integrity.

By leveraging AI- and voice-powered digital lab assistants, LabTwin empowers scientists to seamlessly integrate standardized procedures into their workflow, eliminating the need to juggle between physical documents and lab operations. This approach not only enhances efficiency but also ensures the utmost integrity of experimental processes, data capture, and regulatory compliance, ultimately driving scientific excellence and accelerating innovation.

 

No Compromise Between Integrity and Efficiency

As organizations continue to recognize the immense value of data and AI, the pursuit of standardization has become a strategic imperative. By aligning procedures, data formats, and practices across various domains, organizations can unlock the true potential of their data assets, leverage advanced analytics capabilities, and drive meaningful outcomes that fuel growth, efficiency, and innovation.

Reducing variability is a proven way to reduce costs by leveraging synergies across sites and scientists. The path forward lies in increasing the standardization of procedures and data; with the right tools, a high level of integrity can be achieved without impairing scientists' efficiency.

Book a short call with our experts to discuss your current challenges and how we can help you overcome your standardization challenges, enabling you to achieve both integrity and efficiency in your laboratory operations.
