Double Trouble: Simulating a Digital Twin

by Mark Patrick, Mouser Electronics

Digital twinning is a relatively new phenomenon that utilises the dramatic increases now being witnessed in computing power to radically change the way that systems are designed and developed. The principal idea is to create a digital model that replicates a real system – whether it is a machine, a factory or potentially even a city. These models can then be used in many different ways – from monitoring a piece of equipment to detect faults before they occur, to testing new configurations on the factory floor. They can even be employed to assess the impact of transport or housing schemes across urban landscapes.

The technology builds on the behavioural models that are already used to develop systems. These models are used to simulate and test the design of a system before it is built, taking test stimuli to validate that the design works as intended. Now these models can be put to use even after the system is built. By feeding the same input and output data as the physical system through the virtual model, the performance of the model should mirror the performance of its real-world counterpart. This opens up a tremendous range of possibilities. The model can be probed in many different ways to find areas of the machine that are not working efficiently, or components that are wearing out, all without having to dismantle equipment or fit hundreds of sensors. Parts can be ordered well ahead of time without needing to be held in stores, and replaced as part of a managed maintenance process so that the machine doesn’t fail unexpectedly and production isn’t disrupted.
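The principle above can be sketched in a few lines of code. In this illustrative example, a simple behavioural model predicts what a machine's vibration should be for a given load; feeding in the same load and vibration data as the physical machine, any reading that drifts too far from the model's prediction flags a component that may be wearing out. The model, the sensor values and the threshold are all hypothetical assumptions for the sake of the sketch, not a real machine's characteristics.

```python
def twin_model(input_load):
    """Behavioural model: expected vibration (mm/s) for a given load (kW)."""
    return 0.4 + 0.05 * input_load  # simple linear model for illustration

def check_for_wear(samples, threshold=0.3):
    """Flag samples where measured vibration drifts from the twin's prediction."""
    flagged = []
    for load, measured_vibration in samples:
        expected = twin_model(load)
        residual = measured_vibration - expected
        if residual > threshold:
            flagged.append((load, measured_vibration, round(residual, 2)))
    return flagged

# The same input (load) and output (vibration) data as the physical machine:
samples = [(10.0, 0.95), (12.0, 1.05), (15.0, 1.60), (11.0, 0.98)]
print(check_for_wear(samples))  # only the drifting reading is reported
```

A production system would of course use a far richer model and statistical drift detection, but the shape is the same: compare the real machine's outputs against the twin's, and act on the residual.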

But digital twinning can be so much more powerful than just predictive maintenance. It also allows multiple systems to be combined together. This is a key capability for the Internet of Things (IoT), allowing thousands (or even millions) of sensors and actuators to be monitored in a virtual world, flagging the real-world equivalent when a situation arises that needs to be dealt with. This can dramatically reduce the monitoring requirements in large-scale IoT deployments.
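This monitoring pattern can be illustrated with a minimal sketch, in which each IoT sensor is mirrored by a lightweight virtual twin and only the devices that leave their expected range are surfaced for real-world attention. The device IDs, limits and readings here are invented for illustration.

```python
class SensorTwin:
    """Lightweight virtual mirror of one physical sensor."""
    def __init__(self, device_id, low, high):
        self.device_id = device_id
        self.low, self.high = low, high
        self.last_reading = None

    def update(self, reading):
        """Mirror the latest reading; return True if it needs attention."""
        self.last_reading = reading
        return not (self.low <= reading <= self.high)

def monitor(twins, readings):
    """Feed a batch of readings into the twins; return device IDs to flag."""
    return [t.device_id for t in twins if t.update(readings[t.device_id])]

twins = [SensorTwin("temp-01", 15, 30),
         SensorTwin("temp-02", 15, 30),
         SensorTwin("temp-03", 15, 30)]
readings = {"temp-01": 22.5, "temp-02": 41.0, "temp-03": 18.3}
print(monitor(twins, readings))  # only the out-of-range device is flagged
```

Scaled up to millions of devices, operators only ever see the handful of twins that report a problem, rather than the whole fleet.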

This is made possible by the increased computational resources being made available in industrial applications. Rather than relying on a desktop machine to run the simulation, the models run in the cloud. This allows models of the machines to be networked together in the same way as the physical machines are networked, building up a digital twin of the entire factory floor.
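The idea of networking machine models together the way the physical machines are networked can be sketched as a simple pipeline, with each stage's output feeding the next stage's input. The machine names, scrap rates and throughput figures below are illustrative assumptions, not real production data.

```python
def press(parts_in):
    return int(parts_in * 0.98)   # assume 2% scrap at the press

def paint(parts_in):
    return min(parts_in, 90)      # assume the paint booth caps at 90 parts/hour

def assemble(parts_in):
    return int(parts_in * 0.99)   # assume 1% rejected at assembly

def line_twin(parts_in, stages):
    """Pipe parts through the chained machine models, as on the real line."""
    for stage in stages:
        parts_in = stage(parts_in)
    return parts_in

# Feed 100 parts/hour into the virtual production line:
print(line_twin(100, [press, paint, assemble]))
```

Because the twins are composed in software, adding a machine to the virtual line is just another function in the chain, which is what makes a twin of the entire factory floor tractable.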

This bigger model can then be used to test out different ways of arranging machinery within the production/processing facility, optimising workflows without having to disrupt the day-to-day operation. The bigger model can be constantly tweaked across a multitude of parameters, in ways that would simply not be possible for a real-world system. This can help validate how manufacturing activities will play out and the yields that can be expected before the production lines have even been started. It may also be combined with machine learning and artificial intelligence (AI) to run through the parameters on the digital twin, in order to identify patterns within the system, finding the optimum efficiencies in a way that even experienced human operators simply could not.
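At its simplest, this kind of optimisation is a parameter sweep over the twin. The sketch below exhaustively tests combinations of two hypothetical layout parameters (a buffer size and a line speed) against a toy yield model; both the model and the parameter ranges are assumptions made purely for illustration.

```python
def simulated_yield(buffer_size, line_speed):
    """Toy twin: yield peaks at a moderate speed and a mid-sized buffer."""
    return 100 - (line_speed - 6) ** 2 - 0.5 * (buffer_size - 4) ** 2

def sweep():
    """Exhaustively test parameter combinations on the twin - something that
    would be impractical (and disruptive) to try on the physical line."""
    best = max(((b, s) for b in range(1, 9) for s in range(1, 11)),
               key=lambda p: simulated_yield(*p))
    return best, simulated_yield(*best)

print(sweep())  # best (buffer_size, line_speed) and the yield it achieves
```

A real deployment would replace the brute-force sweep with machine learning over far larger parameter spaces, but the workflow is the same: explore on the twin, then apply the winning configuration to the physical plant.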

This approach can then be used to show how new equipment can be introduced into the workflow in such a way that it enhances the operation of the factory rather than causing bottlenecks or stoppages (as often happens). Staff can be trained on virtual models of the equipment before setting foot in the factory, and the different components and mechanisms within the equipment can be annotated as part of the digital twin to make training quicker and easier.

Once you have digital twinning of buildings, the next logical step is to extend this to the city level. Dutch research centre TNO and its Belgian counterpart IMEC have developed a comprehensive digital twin of Antwerp. This digital 3D replica of the city combines noise pollution data with real-time sensor information concerning air quality and traffic congestion, along with detailed computer models. It offers an up-to-date and predictive view of the situation within the metropolitan area where the impact of planned measures can be simulated and tested. This allows the effects of certain scenarios on citizens to be predicted in advance.

“The possibilities of this tool are endless,” says Jan Adriaenssens, Programme Director for the City of Things at IMEC. “In principle, in the digital twin we can process all data that policymakers find important – from bicycle data to information about the sewerage system. The city council has a digital control room at its disposal to plan measures to improve the quality of life and mobility of their city.”

The implementation of digital twinning technology is being predominantly driven by large system integration firms, such as Siemens, that operate at the level of machines, factories and city infrastructure. Siemens, for example, can take device models from its Mentor Graphics design tool division and combine them into hugely complex models. The concept is so powerful that the company has restructured – with one of its three divisions now focussed around the digital factory. The power of cloud computing allows each digital twin to scale up alongside its physical counterpart by adding processor units quickly and easily. Other processor units can then be used to probe and interrogate the virtual model when necessary.

Being able to look deeply into the heart of the system – whether it is the IoT, a complex factory workflow or a multimillion-population metropolis – without disturbing its operation is an immensely powerful concept, and as an industry we are just starting to scratch the surface of what can be achieved with digital twinning.