Securing the integrity of digital twins in encrypted operational environments
As the name suggests, a digital twin is a virtual replica of a physical process or system. It is connected to its real-world counterpart via a continuous stream of live sensor data. For those working in operational environments, a digital twin can flag issues in manufacturing plants, electrical substations, or mines, enabling operators to quickly test changes before rolling them out into the live environment and proactively manage systems.
For example, in mines, throughput adjustments can be tested against a digital model before being implemented across multiple shafts. In power utilities, switching scenarios can be explored digitally before engineers act in the field. Essentially, the twin informs what happens next.
That means data is critical.
Information flows in from PLCs, SCADA systems, remote sensors, and engineering workstations. When that flow is steady, the model tracks the environment closely. If telemetry is delayed or arrives out of sequence, nothing breaks immediately; the discrepancy may only come to light later, when outcomes do not match expectations.
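A minimal sketch of how such a discrepancy could be caught at ingestion time rather than discovered later. The sequence-number and timestamp fields here are illustrative assumptions; real PLC and SCADA payloads vary by protocol:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    seq: int          # assumed monotonically increasing sequence number
    timestamp: float  # assumed sender-side timestamp, in seconds

def check_stream(readings, max_delay=2.0, now=None):
    """Flag out-of-sequence or stale telemetry instead of silently ingesting it."""
    issues = []
    last_seq = None
    for r in readings:
        if last_seq is not None and r.seq <= last_seq:
            issues.append((r.seq, "out of sequence"))
        if now is not None and now - r.timestamp > max_delay:
            issues.append((r.seq, "stale"))
        last_seq = r.seq if last_seq is None else max(last_seq, r.seq)
    return issues
```

Flagged readings could then be excluded from the twin's state update, or at least annotated, so that a later mismatch between model and plant can be traced back to degraded telemetry.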
“At the point where the digital model influences operational decisions, integrity becomes an operational issue,” says Nithen Naidoo, CEO of Snode Technologies.
Most of the traffic between those systems is encrypted. This includes remote access sessions, firmware updates, and telemetry from distributed assets. In South Africa, where infrastructure often spans large, remote areas, encryption is standard practice.
However, encrypted channels are also used to carry out malicious activity. The conventional response has been to decrypt traffic at the network boundary for inspection and then re-encrypt it before it proceeds. That is a common approach in enterprise IT.
Industrial control systems do not always accommodate that model comfortably. SCADA networks and PLC-controlled processes are engineered for consistent timing and predictable behaviour. Changes in network flow can have significant consequences.
In these environments, extra devices are introduced to take care of the decryption and encryption process. In turn, these devices must be secured, patched, and monitored. Over time, what began as a protective layer becomes another form of dependence.
A digital twin relies on steady patterns between systems. If latency shifts or routes change, even subtly, the twin is no longer observing a natural state.
Research into digital twin ecosystems has begun to look at this expanded exposure. As twins incorporate IoT telemetry and machine learning models, they inherit the weaknesses of those inputs. Data poisoning and manipulation are now practical concerns in distributed systems.
The question then becomes: how do organisations look for malicious behaviour without altering the communication patterns the twin depends on? One answer is to examine behaviour rather than content.
So, instead of decrypting data packets, a behavioural approach looks at how systems interact: which devices communicate, how frequently, and whether anything changes. Any deviation becomes a potential trigger for concern.
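A behavioural baseline of this kind can be built from flow metadata alone, without inspecting packet contents. The sketch below compares observed device-to-device message counts against a learned baseline; the field names and thresholds are illustrative assumptions, not taken from any specific product:

```python
from collections import Counter

def build_baseline(flows):
    """Count messages per (src, dst) pair using flow metadata only."""
    return Counter((f["src"], f["dst"]) for f in flows)

def detect_deviations(baseline, window, ratio=3.0):
    """Flag new device relationships and pairs whose message rate
    exceeds the baseline by more than `ratio`."""
    observed = Counter((f["src"], f["dst"]) for f in window)
    alerts = []
    for pair, count in observed.items():
        expected = baseline.get(pair, 0)
        if expected == 0:
            alerts.append((pair, "new relationship"))
        elif count > expected * ratio:
            alerts.append((pair, "rate spike"))
    return alerts
```

Because only endpoints and frequencies are examined, the encrypted payloads are never touched and no decryption appliance sits in the path, so the timing characteristics the twin relies on are left intact.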
This aligns more closely with Zero Trust thinking. In a Zero Trust model, relationships and identities are continuously evaluated. When digital twins ingest data across IT and OT domains, those relationships matter more than the content itself.
Snode has been building network-level digital representations of system relationships for some time. The company's recently granted US patent formalises a method for analysing encrypted communications without decrypting them. This is not about bypassing encryption but about observing patterns while leaving the communication intact.
“In operational environments, stability is designed in. Security controls should not undermine that,” adds Naidoo.
Consider a processing plant in which a digital twin models conveyor speed as a function of ore quality. The twin relies on telemetry from multiple PLCs and sensors. If the infrastructure responsible for conducting inspections alters timing or introduces new failure points, the integrity of that relationship changes. The same principle applies in a substation environment, where switching simulations depend on predictable SCADA input.
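The integrity of such a relationship can be monitored by comparing the twin's prediction against live telemetry and flagging persistent residuals. The linear model below is a placeholder for illustration, not the plant's actual transfer function, and the tolerance is an assumed value:

```python
def predicted_speed(ore_quality, gain=1.5, offset=0.2):
    """Placeholder twin model: conveyor speed as a linear function of ore quality."""
    return gain * ore_quality + offset

def integrity_check(samples, tolerance=0.1):
    """Return (ore_quality, measured_speed) samples where the measured
    speed drifts from the twin's prediction by more than `tolerance`."""
    return [
        (q, measured)
        for q, measured in samples
        if abs(measured - predicted_speed(q)) > tolerance
    ]
```

A growing list of flagged samples would indicate either a physical change in the plant or a degraded telemetry path, which is exactly the kind of divergence the article argues must be caught before the twin informs operational decisions.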
Digital twins are increasingly treated as operational assets, which makes governance a critical element. Questions such as who validates the data and how trust is maintained across distributed inputs become important considerations. Companies must therefore ensure that the digital model remains aligned with the physical system it represents.
The problem, then, is not encryption itself but stability and visibility. The architecture chosen to manage both will determine whether digital twins remain dependable.
“As deployments expand across South Africa and globally, the integrity of digital twins will not be defined by modelling sophistication alone. It will be shaped by the security decisions made in the network layers beneath them,” concludes Naidoo.