For many years the Laboratory Informatics community has been talking about the “Lab of the Future”, “Lab 4.0”, the “Smart Lab” or the “Lab of the 21st Century”. Now we have been in the 21st Century for quite a while – but what has happened so far?
Labs have moved from paper to electronic records (“paper on glass”), and some have moved on to the digital lab by implementing digital workflows. But if the lab of the 21st Century is supposed to be new and transformative, this isn’t enough. Labs need to become truly digitalized – using digital technologies to change a business model.
The key to digital – and even more so to digitalized – lab operations is connectivity, which builds the foundation for a new way of working.
Many companies have considered – and even tested – the new technologies that have emerged: Cloud Computing, Data Lakes, the Internet of Laboratory Things, AI and Machine Learning, Virtual and Augmented Reality, and Voice Control. Are these technologies relevant for the Lab of the 21st Century? Do they provide any value?
Only connectivity allows organizations to leverage these new advanced technologies and to make an impact on the experience of lab scientists, the productivity of the lab and the re-use of scientific data.
The most basic connectivity in the lab is between data generators – instruments or any application used to capture or enter data – and data consumers. These are the tools that create reports and documents, generate analytics and dashboards, and provide secure long-term storage for the large amounts of valuable data generated in the lab.
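As an illustration, this generator-to-consumer flow can be sketched as a simple publish/consume pattern. All names here (`Measurement`, `LabDataBus`) are hypothetical – a minimal sketch, not any specific lab-informatics API:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

# Hypothetical record emitted by a data generator (e.g. an instrument driver).
@dataclass
class Measurement:
    sample_id: str
    technique: str
    values: dict
    captured_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# A minimal broker: consumers (reporting, analytics, archiving) subscribe,
# generators publish, and every consumer receives every record.
class LabDataBus:
    def __init__(self) -> None:
        self._consumers: list[Callable[[Measurement], None]] = []

    def subscribe(self, consumer: Callable[[Measurement], None]) -> None:
        self._consumers.append(consumer)

    def publish(self, m: Measurement) -> None:
        for consumer in self._consumers:
            consumer(m)

# Example consumers: a report line and a long-term archive.
archive: list[Measurement] = []
bus = LabDataBus()
bus.subscribe(lambda m: print(f"Report: {m.sample_id} {m.values}"))
bus.subscribe(archive.append)

bus.publish(Measurement("S-001", "HPLC", {"purity_pct": 99.2}))
```

In practice the bus would be a message broker or integration platform rather than an in-process list, but the decoupling of generators from consumers is the same idea.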
But laboratory operations encompass more than this, and connectivity must go much further. Systems to manage samples and chemicals, as well as equipment and personnel, are part of the lab environment. So are processes such as the development and execution of methods, the preparation of samples and experiments, and analysis, reporting and decision-making. And the lab must be able to communicate bi-directionally with an organization’s business systems, such as its Enterprise Resource Planning (ERP) system. Integrating all these elements sounds complex, but the benefits are significant.
What is required to enable this connectivity? The basis is an infrastructure that is platform-based to provide the required backbone, cloud-enabled to ensure an agile, collaborative way of working with a low Total Cost of Ownership (TCO), and able to leverage a data lake for long-term storage of many different data types. But technology is not enough. Standardization is key, and different aspects of standardization have to be considered.
One aspect of standardization is the actual data format within a standardized framework. A standard data format must define the specifications for a vendor- and technique-agnostic way to store the data and its contextual metadata, so that it allows both long-term and real-time access to the data. The standardization of taxonomies and ontologies provides a controlled vocabulary and defined relationships for the contextual metadata about materials, equipment, processes, results, and properties. The output of pre-competitive consortia like Allotrope and the Pistoia Alliance can help here.
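To make the idea concrete, a vendor- and technique-agnostic record might pair raw values with contextual metadata drawn from a controlled vocabulary. The vocabularies and field names below are assumptions for illustration – not the Allotrope data model or any published standard:

```python
import json

# Hypothetical controlled vocabularies for the contextual metadata
# (a real taxonomy would come from a consortium standard, not these lists).
TECHNIQUES = {"chromatography", "mass-spectrometry", "nmr"}
UNITS = {"mg/mL", "percent", "minutes"}

def make_record(sample_id, technique, results, equipment):
    """Build a technique-agnostic record; reject uncontrolled vocabulary."""
    if technique not in TECHNIQUES:
        raise ValueError(f"unknown technique: {technique}")
    for r in results:
        if r["unit"] not in UNITS:
            raise ValueError(f"unknown unit: {r['unit']}")
    return {
        "sample": {"id": sample_id},
        "technique": technique,
        "equipment": equipment,
        "results": results,
    }

record = make_record(
    "S-001",
    "chromatography",
    [{"name": "purity", "value": 99.2, "unit": "percent"}],
    {"instrument_id": "HPLC-07"},
)
serialized = json.dumps(record)  # store in a data lake as plain JSON
```

Because every record uses the same structure and vocabulary regardless of vendor, any downstream consumer can read it without instrument-specific parsers.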
The other aspect is the stewardship and governance of data management and standardization. Organizations must make sure that the defined standards are used throughout their labs. This is not always easy: some scientists may perceive standards as an additional burden that reduces their scientific freedom, rather than recognizing the value of open science, where everybody benefits from exchanging, accessing and understanding each other’s scientific results. Labs should also ensure their scientific data are FAIR (findable, accessible, interoperable, reusable). The FAIR data principles act as a guideline to support the implementation of technology.
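Governance can be backed by automated checks. As a hedged illustration, a pipeline might verify that each record carries a minimum set of metadata before it is accepted into long-term storage – the required field names here are assumptions, not a FAIR specification:

```python
# Hypothetical minimum metadata a record needs to be findable and reusable.
REQUIRED_FIELDS = {"id", "title", "creator", "created", "license", "format"}

def fair_gaps(metadata: dict) -> set[str]:
    """Return the required fields missing from a record's metadata."""
    return REQUIRED_FIELDS - metadata.keys()

meta = {
    "id": "doi:10.0000/example",   # hypothetical identifier
    "title": "HPLC purity, batch 42",
    "creator": "QC Lab 3",
    "created": "2024-05-01",
    "format": "application/json",
}
print(sorted(fair_gaps(meta)))  # this record still lacks a license
```

A check like this makes the governance policy enforceable at ingest time instead of relying on每 scientist remembering it.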
The Value of Connectivity
- Enables the liquidity of data and information flow between people and systems, vertically and horizontally
- Helps to overcome data silos and departmental disconnect
- Supports processes like materials characterization, formulations, process development, stability studies, or batch releases in one seamless experience
- Increases lab productivity by up to 40%
- Ensures data integrity and improves data quality
- Drives collaboration and data sharing on a global scale
- Allows the integration of new advanced technologies and devices
- Enables data- and knowledge-based decision-making – in real-time
- Creates a new transformative user experience
One Step Further
What if we think a bit further – and connect with a larger ecosystem beyond the lab? If we connect the lab directly with suppliers to search for, find and procure the right compounds, we can reduce the time chemists spend on sourcing by 50%, letting scientists work far more efficiently. If we connect to potential providers of chemical and biological synthesis and leverage their information, scientists can evaluate the fastest and/or most cost-effective way of developing a new product. If we connect to experts in modeling & simulation, we can replace physical testing with virtual testing without having to build the expertise in-house. If we connect to other labs, in-house or external, we can optimize the scheduling of lab work in unprecedented ways, removing testing bottlenecks throughout the organization.
This additional level of connectivity will elevate the productivity of the laboratories across departments, provide deeper insight into work throughout the value chain and drive successful innovation in a complex business environment.
How far along are you on your journey to the Lab of the 21st Century? To explore this, watch the webinar “Labs in the 21st Century”.