Digital twins – A brand new 40-year-old market

In discussing the emerging digital twin market it is important to note that this is in many ways an evolution of a software market founded in the 1980s and 1990s, focused on the collection of critical process and control data across many industries. There have been several generations of software suppliers in this space, and many have been acquired by companies like Siemens, ABB, and AVEVA – remember Wonderware? The first data historian system I worked on ran on a DEC VAX/VMS system in the late 1980s at a defense aviation manufacturer. These operational historians are usually built as time series “databases” collecting operational DCS and PLC data. Streams of operational process data, or tags, are collected and arranged as sequential readings of individual control or sensor values.
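That tag-stream model is easy to picture in code: each tag is simply an ordered sequence of timestamped readings. A minimal sketch (the tag name and values here are hypothetical, not from any real historian):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Reading:
    """One timestamped value for a single historian tag."""
    tag: str           # e.g. "FIC-101.PV" -- a hypothetical flow controller's process value
    timestamp: datetime
    value: float

# A tag's stream is just its readings in time order, appended as they arrive
# from the DCS or PLC.
start = datetime(2020, 9, 1, 8, 0, 0)
stream = [
    Reading("FIC-101.PV", start + timedelta(seconds=i), 42.0 + 0.1 * i)
    for i in range(5)
]

for r in stream:
    print(r.tag, r.timestamp.isoformat(), r.value)
```

Real historians add compression, interpolation, and quality flags on top, but the append-only, time-ordered structure is the essence of the model.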

Time series databases are now back in vogue as machine learning creates opportunities in process management, predictive maintenance, computer vision quality control, commercial and consumer recommendation engines, regulatory compliance, and a host of other opportunities to introduce intelligence into business processes.

Time series is an important dimension, and there are a number of contemporary time series databases, including open source products such as Apache Druid, Apache Kudu, Prometheus, Graphite, InfluxDB, and RRDtool, and commercial solutions such as IBM Informix TimeSeries and eXtremeDB.

There is a remarkable range of options for building historians, especially as part of IoT deployments. The challenge is that many of these products cannot manage the high-scale, high-availability requirements of capturing millions of time series data points. In these scenarios, large commercial suppliers that can manage millions of tags include OSIsoft PI and AVEVA eDNA. AWS, Azure, and GCP are all focusing on improving the scale, availability, and cost of their offerings in this market as the value of this data in machine learning and enterprise operations becomes more widely understood; this is now a rapidly evolving market. Most current offerings here are available as cloud services, and many are supported by the large cloud IaaS suppliers.

Operational historians often feed enterprise historians – aggregations of data from multiple operational historians designed to support enterprise business analytics and planning. Relating plant or facility data from sensors and controls to usable performance data at the enterprise level is where the discussion moves from discrete data to models of operational and product performance, and it reinforces the idea that to be useful this data must be usable within the model of models that makes up the digital twin.
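One way to picture that operational-to-enterprise rollup is aggregating raw tag readings into the hourly summaries a planning or analytics system would consume. A simplified sketch with hypothetical tag data (real enterprise historians do far more, including context and asset models):

```python
from collections import defaultdict
from datetime import datetime, timedelta
from statistics import mean

# Hypothetical raw operational readings: (tag, timestamp, value),
# sampled every 10 minutes over three hours.
start = datetime(2020, 9, 1, 0, 0, 0)
raw = [
    ("TI-200.PV", start + timedelta(minutes=10 * i), 150.0 + (i % 6))
    for i in range(18)
]

# Roll each tag's readings up into hourly averages -- the kind of
# aggregate an enterprise historian might expose to business analytics.
hourly = defaultdict(list)
for tag, ts, value in raw:
    hour = ts.replace(minute=0, second=0, microsecond=0)
    hourly[(tag, hour)].append(value)

summary = {key: mean(values) for key, values in hourly.items()}
for (tag, hour), avg in sorted(summary.items()):
    print(tag, hour.isoformat(), round(avg, 2))
```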

This convergence of AI, IoT, and digital transformation has driven the notion of process and data historians back to the forefront of contemporary thinking in the development of operational systems.  System requirements now include topics like:

Preventive/Predictive Maintenance – reducing cost and downtime

Generative Design – reducing cost, adding context, and increasing capability

Optimizing Supply Chains – creating better risk management and improved resilience

Computer Vision – real-time observation, quality processes, and improved output
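To make the first requirement above concrete: a common starting point for predictive maintenance on historian data is flagging readings that drift far from a rolling baseline. A deliberately simplified sketch using hypothetical vibration readings (production systems would use learned models, not a fixed threshold):

```python
from statistics import mean, stdev

# Hypothetical vibration readings from a pump bearing (mm/s RMS),
# with one anomalous spike that might precede a failure.
readings = [2.1, 2.0, 2.2, 2.1, 2.0, 2.2, 2.1, 6.5, 2.1, 2.0]

def flag_anomalies(values, window=5, threshold=3.0):
    """Flag indices whose value is more than `threshold` standard
    deviations from the mean of the trailing `window` readings."""
    flagged = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(values[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

print(flag_anomalies(readings))  # the 6.5 reading stands out from its baseline
```

Even this crude approach depends on exactly what historians provide: dense, time-ordered, trustworthy sensor data.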

This understanding of history is important for new entrants in the market, from both a supplier and a customer perspective. There are many lessons to be learned from the hard-won industrial best practices for these applications that cut through the hype of new market offerings. Control systems are not always well understood by the current generation of IoT/AI-centric vendors entering this space, and it is important to vet these offerings for proven abilities to scale to billions of data points from millions of devices. Not every vendor needs to reinvent the wheel. Contemporary Industrial IoT solutions need to consider the four V’s of big data – velocity, volume, variety, and veracity. Legacy suppliers here have significant proven capabilities on the velocity and volume side of the equation. To survive and compete they will need to build out capabilities in the variety (context) and veracity (trust) parts of the equation while focusing on a fifth V – value. This is a big part of the reason Emerson Electric, the technology and engineering behemoth, agreed to buy Open Systems International for $1.6 billion in cash at the end of last month. Emerson is looking to digitize its grid operations, and it recognizes the value a proven high-scale solution brings.

Innovation in this market comes from improving access, organization, analytics, and insight.  Modeling the data in ways that produce new understanding, innovation, and customer value is the key requirement in today’s market.

Scott Lundstrom
