Some friends have reached out about my last post on machine learning. Some of you expressed enthusiasm that I was embracing lifelong learning; some of you thought I was crazy. Why is an analyst playing with tools and math? No, I am not considering a career change to data science or going back to software development – I am simply curious. It is not even the ML that is at the center of my interest; it is just a necessary step in the journey. What really fascinates me is the opportunity these tools create in the context of a digital twin. IoT and ML are the raw materials and the tools – the insight lives in the repository where we model processes and create context. While this might be a database or a data lake, the most interesting example for me is the digital twin.
I am not the only one who's interested. The Digital Twin Consortium ( https://www.digitaltwinconsortium.org/ ) is up and running, working to establish a taxonomy and standards. Every IoT, supply chain, and PLM vendor is on the bandwagon, and we all have high hopes for the transformative impact these new tools will bring. So why all the enthusiasm – what is driving the move to support digital twins? Think of the digital twin as a source for both current-state and historical data on the actual performance of a thing. This thing might be a component (a thermostat, sensor, or industrial part), an asset (a truck, machine, or building), an employee (a service tech), or a process (business or manufacturing). Things experience events, respond to commands, generate alerts, and create detailed data and history.

Part of the capability of the digital twin comes from this complexity of having models of models to describe complex assets, processes, and systems. Think about a digital twin of your supply chain. It would encompass items, packed in containers, moving through the physical world to distributors and customers. This model might inherit data from the process that created the product (supplier, lot#, batch record) at one end of the chain and inform your customer's model at the other. Supply chains and manufacturing assets are just the beginning. As this technology becomes better understood and deployments become easier, use will grow into increasingly complex spaces. Digital twins are already being developed in life sciences in support of systems biology, modeling complex organs like the human heart. They are also tracking complex human and business processes in large organizations and are becoming areas of interest in strategy simulation and defense.
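The "models of models" idea above – a twin that holds current state, accumulates event history, and composes child twins – can be sketched in a few lines of code. This is a minimal illustrative sketch, not any vendor's API; every class and field name here is my own invention:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any


@dataclass
class DigitalTwin:
    """Illustrative twin: current state, event history, and child twins."""
    twin_id: str
    properties: dict[str, Any] = field(default_factory=dict)   # current state
    history: list[tuple[datetime, str, dict]] = field(default_factory=list)
    children: list["DigitalTwin"] = field(default_factory=list)  # models of models

    def record_event(self, event_type: str, data: dict) -> None:
        """Append an event to the twin's history and fold it into current state."""
        self.history.append((datetime.now(timezone.utc), event_type, data))
        self.properties.update(data)

    def add_child(self, child: "DigitalTwin") -> None:
        self.children.append(child)


# A container twin composed of an item twin that carries upstream
# context (supplier, lot) inherited from the process that made it.
item = DigitalTwin("item-001", properties={"supplier": "ACME", "lot": "L42"})
container = DigitalTwin("container-9")
container.add_child(item)
container.record_event("gps_update", {"lat": 47.6, "lon": -122.3})
item.record_event("temperature", {"temp_c": 4.2})
```

The point of the sketch is the shape, not the code: state and history live together, and complex assets are built by composing simpler twins.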
In manufacturing and supply chain, the wealth of complex multi-model use cases is one of the most vexing issues with digital twins today – to be inclusive, we must have models of models shared across corporate entities to create a view of the entire supply chain, from our supplier's supplier to our customer's customer. Understanding the status and history of our assets and processes lets us bring machine learning tools into the equation, running simulations, optimizations, and predictions against our models. These models can also add contextual data, allowing us to understand how to improve resiliency and agility in the face of ongoing and future disruptions. It is not just our assets that require digital twins, but also our processes, networks, and organizations. Understanding, simulating, and predicting are becoming core competencies that every organization must master, and digital twins will be a powerful tool for driving these insights.
To realize the benefits of this tremendous opportunity, we need standards, agreed-upon taxonomies, and commercial development tools and platforms for this market to flourish. The supplier community is reacting, and many practitioners from the PLM, IoT, and analytics/data science markets are beginning to focus on establishing these foundational standards.
The large platform suppliers are moving forward with tools and PaaS offerings to try to win share and establish "de facto" standards (as they always do). AWS, GCP, Predix, IBM, and Microsoft are all building extensions to their existing IoT tools and platforms to support the creation of digital twins. One of the more complete early offerings is Microsoft's Azure Digital Twins. Featured at the Microsoft Build 2020 event, the preview release supports a new Digital Twin Definition Language (DTDL) based on JSON-LD. By building on JSON-LD, a well-accepted and simple linked-data format, Microsoft is supporting an open standard from the beginning. This matters because digital twins require an open, object-oriented approach – one that supports inheritance and multiple instances in complex multitier models, keeps those models portable, and works with widely available cloud platforms and AI frameworks.
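To make the DTDL idea concrete, here is a small model interface in the style of DTDL v2. The thermostat example below is illustrative only (the `dtmi:com:example` identifier is a placeholder); consult Microsoft's DTDL specification for the authoritative syntax:

```json
{
  "@context": "dtmi:dtdl:context;2",
  "@id": "dtmi:com:example:Thermostat;1",
  "@type": "Interface",
  "displayName": "Thermostat",
  "contents": [
    {
      "@type": "Telemetry",
      "name": "temperature",
      "schema": "double"
    },
    {
      "@type": "Property",
      "name": "targetTemperature",
      "schema": "double",
      "writable": true
    }
  ]
}
```

Note that this is ordinary JSON-LD: the `@context` and `@id` keywords are what make the model self-describing and portable across tools, which is exactly the open-standard point made above.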
If you haven't thought about how a digital twin can improve your products, processes, and outcomes, you should. This technology is truly transformative for most businesses, and it deserves a place on your technology roadmap.
#artificialintelligence, #iot, #digitaltwin, #aiiotsupplychain