Guiding Principles for Building a Smart Laboratory

Building a smart laboratory involves a multi-disciplinary approach and a commitment to move away from traditional paper processes. Many day-to-day, routine laboratory operations are still carried out using manual, paper-based systems for everything from recording test results to managing inventories to documenting and scheduling instrument calibrations. This reliance on paper is not just expensive; it also decreases the efficiency of the laboratory workplace and leaves the laboratory vulnerable to having valuable data misplaced, lost or destroyed.
Technological advancements in wireless networks, laboratory equipment, sensor technologies, micro-electronics, informatics software and other areas have helped to spawn a digital revolution in connectivity in the modern laboratory. Forward-thinking organizations are capitalizing on this digital revolution to create highly integrated and automated smart laboratories that help facilitate product quality and efficiency in manufacturing processes, as well as enhanced innovation and product development through data analysis, process monitoring and continuous feedback.
A smart laboratory, in the context of this blog, is a laboratory that uses the latest technology (e.g., big data, the internet of things, paperless processes, automation, cloud technologies and informatics software) to manage and automate scientific activities in ways that optimize both scientific research and manufacturing. The benefits of using these smart technologies in the lab include enhanced process efficiency, data integrity, innovation, product safety and cost-effectiveness, along with happy scientists who are freed up to focus on doing science. What follows are some best practice recommendations for creating smart laboratories in your organization.
A laboratory may have a number of different data systems associated with the main analytical instruments/techniques, such as chromatography, MS, NIR and UV. On the surface, such a lab appears to be very effective because of the hard science being done. In practice, however, these instruments are islands of automation in a sea of paper, as the main way data moves from system to system is still manual input, with paper as the transport medium. In addition, such a lab usually has processes that have evolved over time and are slow and inefficient, involving tasks that add little value to the laboratory's output.
In transitioning to a smart laboratory, there are a number of best practice recommendations that should be followed including:
Evaluate Work Processes. In considering a smart laboratory, the first step is a workflow analysis to understand the basic lab processes and computerized systems and how they fit together. This analysis will involve interviews with laboratory analysts and other stakeholders to document the current state workflows.
Once the current state workflows are documented, the next stage involves designing the optimized future state. The main goal here is to find the root causes of bottlenecks and issues in the workflows and to design process improvements that use the relevant technology to increase automation and support innovation. The future state process should, where practical, have electronic ways of working that deliver business value, with efficient hand-offs and transfers between applications and organizational units. The final result of an effective business process analysis is a set of optimized future state requirements that will be used to guide the laboratory IT architecture.
Design the Laboratory Architecture. Before engaging in any technology selection or implementation activities, it is important to design a laboratory IT architecture that is aligned with business goals, along with a practical roadmap to deployment. The architecture will detail all the different systems being deployed, the integration plan, the volume of data generated, and where the data will be stored (e.g., in an archive system, alongside the individual data systems, or on a networked drive).
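As an illustration only, the sketch below shows one way to keep such an architecture inventory in a machine-readable form; the system names, data volumes and storage targets are placeholders, not a recommended design.

```python
# A minimal, machine-readable sketch of a laboratory architecture inventory.
# System names, data volumes and storage targets are placeholders.
from dataclasses import dataclass

@dataclass
class LabSystem:
    name: str                  # e.g., CDS, LIMS, ELN
    data_per_day_gb: float     # estimated data generated per day
    storage_target: str        # "local", "network share" or "archive"
    integrates_with: tuple     # planned electronic interfaces

architecture = [
    LabSystem("Chromatography Data System", 2.0, "archive", ("LIMS",)),
    LabSystem("LIMS", 0.5, "network share", ("ELN", "ERP")),
    LabSystem("High-resolution NMR", 20.0, "archive", ("ELN",)),
]

# One derived figure the architecture should be able to answer: storage growth.
total_gb_per_year = sum(s.data_per_day_gb for s in architecture) * 365
print(f"Estimated storage growth: ~{total_gb_per_year:,.0f} GB/year")
```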
Design a Strong Network. Working electronically requires reliance on the IT infrastructure and support systems. If the network fails, or one element breaks down, laboratory work can be severely impacted. Characteristics of the network will include:
- Redundant network cabling – critical connections need at least two independent routes through cables, switches and routers.
- Sufficient network bandwidth (capacity) to handle laboratory data. What type of data files will you be working with? Are your files 50 kB CDS files or 1 GB high-resolution NMR files? Network capacity needs to be sufficient for your needs (a rough sizing sketch follows this list).
- Computer hardware, especially servers and data storage devices, must be resilient and fault tolerant – dual power supplies, dual network and redundant disk storage.
- Power backups in place in case of interruptions to the electrical supply (not only in the computer room, but also in the communications cupboards housing switches and routers) to prevent loss of data in transit.
- Backup and recovery systems to ensure data are not lost.
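To make the bandwidth question concrete, here is a back-of-the-envelope sizing sketch; the file sizes and acquisition rates are assumptions to be replaced with your own figures.

```python
# Rough network sizing sketch (illustrative numbers only).

def required_mbps(file_size_bytes: float, files_per_hour: float) -> float:
    """Average sustained throughput, in megabits per second, for one data stream."""
    return file_size_bytes * 8 * files_per_hour / 3600 / 1e6

workloads = {
    "CDS (50 kB files, 120/hour)": required_mbps(50e3, 120),
    "High-resolution NMR (1 GB files, 4/hour)": required_mbps(1e9, 4),
}

for name, mbps in workloads.items():
    print(f"{name}: ~{mbps:.2f} Mbit/s sustained")

# Peak demand (several instruments finishing runs at once) can be far higher
# than the average, so size links and switches with headroom.
```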
Capture Data at the Point of Origin. One of the key concepts that should be designed into a smart laboratory, where possible, is to acquire data in an electronic format at the point of origin. While this is an important goal to strive for, the principle of integration must be balanced with the business reality of cost-effective interfacing. There is a wide range of data types to consider, including observational data (e.g., odor, color, size), instrument data (e.g., pH, LC, UV, NMR), and computer data (e.g., manipulations or calculations based on previous data). Questions to ask to help determine the feasibility (in terms of ROI) of capturing data at the point of origin include the following (a simple payback sketch follows the list):
- What are the data volumes?
- What is the number of samples?
- What is the frequency of instrument use?
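One simple way to answer these questions is to compare the ongoing cost of manual transcription with the one-off cost of interfacing the instrument. The figures below are purely illustrative assumptions.

```python
# Back-of-the-envelope feasibility check. Every figure below is an
# illustrative assumption to be replaced with your laboratory's own numbers.

samples_per_day = 200        # assumed sample throughput
minutes_per_sample = 2.0     # assumed manual entry and checking time per sample
working_days = 250
analyst_hourly_cost = 60.0   # assumed fully loaded cost per hour
interfacing_cost = 15000.0   # assumed one-off cost to interface the instrument

annual_hours = samples_per_day * minutes_per_sample / 60 * working_days
annual_saving = annual_hours * analyst_hourly_cost
payback_years = interfacing_cost / annual_saving

print(f"Manual transcription: ~{annual_hours:.0f} hours/year "
      f"(~{annual_saving:,.0f} per year at the assumed rate)")
print(f"Estimated payback on interfacing: ~{payback_years:.2f} years")
```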
Eliminate Transcription Error Checks. The key point when designing electronic workflows is to ensure that once the data is acquired it is not printed out or transcribed again, but is transferred electronically between systems using validated routines. Manual error checks and transcription of data should be eliminated as much as possible, with simple electronic workflows that transfer the data seamlessly between networked systems. Automatic checks should be implemented within the deployed systems to ensure the data is transferred and manipulated correctly. Where appropriate, security and audit trails for data integrity should also be implemented.
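As a minimal sketch of what such a validated hand-off can look like, the example below copies an acquired result file to another system, verifies a checksum after transfer and appends an audit-trail record. The paths and audit format are assumptions; a production deployment would use your organization's validated interfaces.

```python
# Minimal sketch of an electronic hand-off with an automatic integrity check
# and an audit-trail entry. Paths and audit format are assumptions.
import hashlib
import json
import shutil
import time
from pathlib import Path

def sha256(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def transfer(source: Path, destination: Path, audit_log: Path) -> None:
    checksum_before = sha256(source)
    destination.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source, destination)           # copy preserving timestamps
    if sha256(destination) != checksum_before:  # automatic integrity check
        raise IOError(f"Checksum mismatch transferring {source.name}")
    entry = {"file": source.name, "sha256": checksum_before,
             "transferred_at": time.strftime("%Y-%m-%dT%H:%M:%S")}
    with audit_log.open("a") as log:
        log.write(json.dumps(entry) + "\n")     # append-only audit record

# Example (hypothetical paths):
# transfer(Path("cds/results/run_042.cdf"),
#          Path("/mnt/lims/inbox/run_042.cdf"),
#          Path("/mnt/lims/audit/transfers.log"))
```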
Create Integration Standards. The reality is that any smart laboratory today will consist of a variety of systems that were not necessarily designed to work together, and for which interoperability is hampered by a lack of data format standardization. The value of automation is limited if data is simply collected but not used. The goal is an integrated lab that is modular, based on standards, and is designed to facilitate data sharing, connectivity and collaboration.
Towards this end, it is important to establish integration data standards in the laboratory to facilitate interoperability between systems. To address this issue, a number of pharmaceutical and biotechnology companies came together in 2012 to form the Allotrope Foundation, with the intention of pooling their collective expertise and resources to develop a solution to commonly experienced data management issues. The Foundation recently released the Allotrope Framework – a suite of software tools that allows software developers to implement a consistent set of data standards in the software that laboratories use to manage their workflows and data.
While the Allotrope Framework was designed for the pharmaceutical industry, the overall methodology implemented in the Framework is very much applicable to other industries and could be used to guide the customization work required to connect disparate systems and implement an integrated smart laboratory.
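The sketch below does not use the Allotrope Framework itself; it simply illustrates the underlying idea of an integration standard by mapping two hypothetical vendor export formats onto one shared vocabulary, so that downstream systems only ever see the common form.

```python
# Illustration of the idea behind integration standards (not the Allotrope
# Framework): map each vendor's export fields onto one shared vocabulary.
# The vendor names, field names and schema here are hypothetical.
import csv
from typing import Dict, List

FIELD_MAPS: Dict[str, Dict[str, str]] = {
    "vendor_a": {"SampleID": "sample_id", "RT_min": "retention_time_min",
                 "Area": "peak_area"},
    "vendor_b": {"sample": "sample_id", "rt": "retention_time_min",
                 "area_counts": "peak_area"},
}

def to_common_format(csv_path: str, vendor: str) -> List[dict]:
    """Read a vendor CSV export and return rows keyed by the common vocabulary."""
    mapping = FIELD_MAPS[vendor]
    with open(csv_path, newline="") as handle:
        return [{mapping[field]: value for field, value in row.items()
                 if field in mapping}
                for row in csv.DictReader(handle)]

# Usage (hypothetical file):
# rows = to_common_format("vendor_a_export.csv", "vendor_a")
```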
Make Sure the Data Is Secure. Security of the data and data backup are of critical importance in an electronic environment. Privacy and security have become significant challenges to address, particularly as data moves out of siloed laboratory systems and into cloud-based environments. Fortunately, cloud providers have taken major steps in recent years to ensure protection from intrusion. Data protection and strong encryption in the cloud are now both possible and available through a number of cloud providers.
When working in an electronic environment, it is important to ensure that “trust zones” are established and that appropriate governance is applied to protect confidential information. Tools can be deployed to monitor operations and provide dashboards and alerts for potential intrusions.
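As one small example of protecting data before it leaves the laboratory's trust zone, the sketch below encrypts a result file prior to upload using the open-source cryptography package; the file path and key handling are simplified assumptions, and real deployments would manage keys in a dedicated key store.

```python
# Minimal sketch: encrypt a result file before it leaves the trust zone.
# Requires the third-party `cryptography` package (pip install cryptography).
# Key handling is deliberately simplified for illustration.
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_for_upload(source: Path, key: bytes) -> Path:
    """Write an encrypted copy of `source` and return its path."""
    token = Fernet(key).encrypt(source.read_bytes())   # authenticated encryption
    encrypted = source.with_suffix(source.suffix + ".enc")
    encrypted.write_bytes(token)
    return encrypted

# Example (hypothetical file):
# key = Fernet.generate_key()   # in practice, store/retrieve via key management
# encrypt_for_upload(Path("results/run_042.json"), key)
```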
Conclusion
The digital revolution is rapidly changing the laboratory environment. New technology tools and processes are being used to drive smart, decentralized manufacturing operations, while integrated, IoT-connected R&D environments are becoming increasingly flexible and innovative. Companies focused on building smart laboratories that use the latest technologies to manage and automate activities in scientific research and manufacturing stand to gain a significant competitive advantage. Smart laboratories in scientific enterprises can help to facilitate greater agility, cost-efficiency, improved innovation and compliance, and much more.
Building a smart laboratory is no small undertaking, however. A project of this nature takes careful planning and a proven methodology to ensure success. If you would like to have an initial, no-obligation consultation with an Astrix informatics expert to discuss your smart laboratory initiative or your overall laboratory informatics strategy, please feel free to contact us.