A good way to think about digital twins is as a flight simulator for the business. Maneuvers and corrections are made within the safe and relatively inexpensive confines of cyberspace, buttressed by the inflow of the latest data. And, importantly, they are AI models in their own right.

The global market for digital-twin technology will grow about 60% annually over the next five years, reaching $73.5 billion by 2027, a recent analysis published by McKinsey indicates. At least 70% of C-suite technology executives at large enterprises are already exploring and investing in digital twins, the consultancy also finds.

They are growing more popular due to “their ability to provide greater operational efficiency, reduce cost, meet demand for more intelligent data analytics and gain a competitive advantage,” Dale Tutt, vice president of Industry Strategy at Siemens Digital Industries Software, told me.

However, more work may still be needed to bring digital twinning into many companies. “Digital twins are less widely adopted than you would think, given that they can lead to tremendous efficiency gains,” said Manfred Kügel, data scientist and IoT industry advisor for SAS. “A major roadblock is building a digital twin for the wrong purpose. Digital twins are very powerful for the narrow set of tasks for which they are built. A digital twin that optimizes product quality based on input materials and suggests optimum process parameters will not be able to optimize other things such as equipment wear and tear.”
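
To make Kügel's example concrete, here is a minimal sketch of such a narrowly scoped twin: a model trained to predict product quality from input-material properties and process parameters, then used to suggest better settings. The feature names, data values, and model choice are illustrative assumptions, not any particular vendor's implementation.

```python
# A sketch of a narrow-purpose twin: predict product quality from input
# materials and process parameters, then suggest better parameter settings.
# All feature names and data below are hypothetical illustrations.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical historical records:
# [material_density, material_purity, oven_temp_c, line_speed_mps] -> quality
X_history = np.array([
    [1.02, 0.97, 210.0, 0.50],
    [1.05, 0.95, 220.0, 0.55],
    [0.99, 0.98, 205.0, 0.45],
    [1.01, 0.96, 215.0, 0.52],
])
y_quality = np.array([0.91, 0.88, 0.94, 0.90])

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_history, y_quality)

# For a new batch of material, sweep candidate process parameters and keep
# the setting the model predicts will maximize quality.
density, purity = 1.00, 0.97
best = max(
    ((temp, speed) for temp in np.arange(200, 226, 5)
                   for speed in np.arange(0.40, 0.61, 0.05)),
    key=lambda p: model.predict([[density, purity, p[0], p[1]]])[0],
)
print("suggested oven_temp_c, line_speed_mps:", best)
```

As the quote warns, this twin knows nothing about equipment wear and tear; a second, separately built model would be needed for that task.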

Digital twins are “an absolute game-changer once deployed and developed successfully, but the benefits are locked behind enormous costs in the adoption process of such systems,” commented Mark Pierce, CEO and founding partner of Wyoming Trust, a law firm. “Thus, costs, complexity of operations, and shortage of available and skilled workers represent some of the biggest roadblocks to digital twin deployment.”

Just as flight simulators digitally replicate complex aircraft components intermingled with mock events and atmospheric conditions, digital twins were first embraced by “industries with complex products or that rely heavily on automation in their production lines, like aerospace, automotive and the semiconductor industry,” Tutt explained. “Whether for engineering or manufacturing, these industries are using digital twin to model product performance as well as production line performance.”

What does it take to build and support digital twins? For starters, a digital foundation is essential. “The technology pillars of a digital foundation include tools for developing 3D geometry, data collection, storage, analysis, visualization, security and a robust product lifecycle management system to maintain the configuration of the data that is collected,” said Tutt. “This data provides the necessary context to link the 3D geometry to simulation models and finally to the organization’s requirements.”
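
As a rough illustration of how those pillars might hang together, the hypothetical schema below links a configuration-controlled 3D geometry to simulation models, requirements, and live sensor streams. Every name and field here is an assumption made for the sketch; a real PLM-backed foundation would be far richer.

```python
# A hypothetical sketch of a digital foundation linking the pillars Tutt
# lists: 3D geometry, simulation models, requirements, and collected data,
# under configuration control. Names and fields are illustrative only.
from dataclasses import dataclass, field

@dataclass
class GeometryModel:
    cad_file: str          # e.g. a path into the PLM-managed vault
    revision: str          # configuration control comes from the PLM system

@dataclass
class SimulationModel:
    solver: str            # e.g. "thermal", "structural"
    geometry: GeometryModel
    parameters: dict

@dataclass
class Requirement:
    req_id: str
    description: str
    verified_by: list[str] = field(default_factory=list)  # solver names

@dataclass
class DigitalTwinRecord:
    asset_id: str
    geometry: GeometryModel
    simulations: list[SimulationModel]
    requirements: list[Requirement]
    sensor_streams: list[str]   # names of the live data feeds

geom = GeometryModel("vault://pumps/0042.step", "C")
twin = DigitalTwinRecord(
    asset_id="pump-0042",
    geometry=geom,
    simulations=[SimulationModel("thermal", geom, {"ambient_c": 25.0})],
    requirements=[Requirement("REQ-17", "Casing temp under 80 C", ["thermal"])],
    sensor_streams=["pump-0042/temperature", "pump-0042/vibration"],
)
```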

At the same time, there are “huge data volumes needed to feed the digital twin,” Kügel cautioned. Essentially, “digital twins are machine learning and AI models, embedded in low-code UI interfaces and paired with no-code visualization tools. You need to build digital twins on a stable AI platform that automates the building, training, deployment and maintenance of analytical models. Of course, using digital twins is a process and not a one-off task; you never stop training, deploying and re-training these models.”
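
A minimal sketch of that continuous train-deploy-retrain loop might look like the following. The data-fetching and deployment steps are hypothetical stand-ins for whatever data platform and serving layer an organization actually runs.

```python
# A sketch of the "never stop training" loop Kügel describes: periodically
# retrain the twin's model on freshly collected data and redeploy it.
import time
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
model = SGDRegressor()

def fetch_latest_batch():
    # Hypothetical stand-in for pulling fresh sensor features and measured
    # outcomes from the organization's data platform.
    X = rng.normal(size=(32, 4))
    y = X @ np.array([0.5, -0.2, 0.1, 0.3]) + rng.normal(scale=0.05, size=32)
    return X, y

def deploy(model):
    # Hypothetical stand-in for pushing the model to the serving layer.
    print("deployed model with coefficients:", model.coef_)

for _ in range(3):                    # in production this loop never ends
    X_new, y_new = fetch_latest_batch()
    model.partial_fit(X_new, y_new)   # incremental update on the new batch
    deploy(model)
    time.sleep(1)                     # e.g. hourly in a real pipeline
```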

Digital twins also require organization-wide engagement that extends beyond the IT and data departments to domain experts, Kügel urged.

There are interesting ways to engage with digital twins as well — such as through extended reality or virtual reality (XR/VR), versus simply looking at a screen. XR/VR technologies “enhance the visualization, interaction, and utilization of digital twins, leading to improved design processes and efficient operations,” Tutt said.

Still, the complexity of digitally replicating large systems can also get in the way. “Artificial intelligence offers a solution to overcoming these roadblocks, especially by enhancing data quality, enabling scalability, facilitating user adoption and optimizing real-time data processing,” said Tutt.
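
One small example of what “enhancing data quality” with AI could mean in practice: screening incoming sensor readings with an anomaly detector so corrupted data never reaches the twin's models. The readings and features below are invented for illustration.

```python
# A hypothetical sketch: use an anomaly detector to filter bad sensor
# readings (spikes, dropouts) before they feed the digital twin.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Hypothetical clean history of [temperature_c, vibration_mm_s] readings.
clean_history = rng.normal(loc=[60.0, 2.0], scale=[3.0, 0.2], size=(500, 2))
detector = IsolationForest(random_state=1).fit(clean_history)

incoming = np.array([
    [61.2, 2.1],    # plausible reading
    [59.4, 1.9],    # plausible reading
    [300.0, 0.0],   # a spike/dropout the detector should flag
])
ok = detector.predict(incoming) == 1   # 1 = inlier, -1 = outlier
print("readings passed to the twin:", incoming[ok])
```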
