Purdue researchers create ‘self-aware’ algorithm to ward off hacking attempts


WEST LAFAYETTE, Ind. — It sounds like a scene from a spy thriller. An attacker gets through the IT defenses of a nuclear power plant and feeds it fake, realistic data, tricking its computer systems and personnel into thinking operations are normal. The attacker then disrupts the function of key plant machinery, causing it to misperform or break down. By the time system operators realize they’ve been duped, it’s too late, with catastrophic results.

The scenario isn’t fictional; it happened in 2010, when the Stuxnet virus was used to damage nuclear centrifuges in Iran. And as ransomware and other cyberattacks around the world increase, system operators worry more about these sophisticated “false data injection” strikes. In the wrong hands, the computer models and data analytics – based on artificial intelligence – that ensure smooth operation of today’s electric grids, manufacturing facilities, and power plants could be turned against the very systems they are meant to protect.

Purdue researchers have developed a novel self-cognizant, self-healing technology that protects industrial control systems against both internal and external threats. The project is led by Hany Abdel-Khalik (center), with Yeni Li, a nuclear engineering postdoctoral associate (right), leading the anomaly detection work and Arvind Sundaram, a third-year nuclear engineering Ph.D. student, leading implementation of the covert cognizance algorithms. (Purdue University photo/Vincent Walter)

Purdue University’s Hany Abdel-Khalik has come up with a powerful response: to make the computer models that run these cyberphysical systems both self-aware and self-healing. Using the background noise within these systems’ data streams, Abdel-Khalik and his students embed invisible, ever-changing, one-time-use signals that turn passive components into active watchers. Even if an attacker is armed with a perfect duplicate of a system’s model, any attempt to introduce falsified data will be immediately detected and rejected by the system itself, requiring no human response.
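The release does not spell out how these signals are constructed, but the general idea of hiding a verification signal inside measurement noise can be sketched in a few lines. The Python below is a hypothetical illustration only, not the published covert cognizance algorithm: it embeds a low-amplitude, one-time pseudorandom watermark in simulated sensor readings and flags any stream that does not carry it. All function names, thresholds, and parameters are assumptions chosen for the example.

```python
# Hypothetical illustration only -- not the Purdue "covert cognizance" method.
# It hides a one-time pseudorandom watermark inside the noise band of sensor
# readings; a verifier that shares the secret can detect data lacking the mark.
import numpy as np

SHARED_SEED = 2024          # secret known to embedder and verifier (assumed)
NOISE_STD = 0.05            # natural sensor noise level (assumed)
MARK_AMPLITUDE = 0.01       # watermark kept well below the noise floor (assumed)

def one_time_mark(seed, n):
    """Generate the one-time-use pseudorandom signal from the shared secret."""
    return np.random.default_rng(seed).standard_normal(n) * MARK_AMPLITUDE

def embed(readings, seed):
    """Add the watermark to genuine telemetry before it leaves the sensor."""
    return readings + one_time_mark(seed, len(readings))

def is_genuine(stream, seed, threshold=0.1):
    """Correlate the stream with the expected mark; spoofed data scores near zero."""
    mark = one_time_mark(seed, len(stream))
    centered = stream - stream.mean()
    score = centered @ mark / (np.linalg.norm(centered) * np.linalg.norm(mark))
    return score > threshold

# Genuine telemetry carries the mark; an attacker's realistic-looking replay does not.
rng = np.random.default_rng(7)
genuine = embed(100.0 + NOISE_STD * rng.standard_normal(4096), SHARED_SEED)
spoofed = 100.0 + NOISE_STD * rng.standard_normal(4096)

print(is_genuine(genuine, SHARED_SEED))   # True  -> accepted
print(is_genuine(spoofed, SHARED_SEED))   # False -> flagged as falsified
```

In a real deployment the signal would change with every transmission, which is what makes it one-time-use and unreproducible by an attacker who only sees past data; in this sketch a single shared seed stands in for that evolving secret.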

“We call it covert cognizance,” said Abdel-Khalik, an associate professor…

Source…