Cybersecurity experts at the Georgia Institute of Technology have assembled a new tool in the fight against hackers: a decoy robot. They built “HoneyBot” to lure hackers into thinking they had taken control of a factory robot, but instead the robot gathers valuable information about them and helps the business protect itself from future attacks.
The four-wheeled robot is smaller than a breadbox. It can be monitored and controlled through the internet. But unlike other remote-controlled robots, the HoneyBot’s special ability is tricking its operators into thinking it is performing one task, when it’s actually doing something completely different.
“The idea behind a honeypot is that you don’t want the attackers to know they’re in a honeypot,” said Raheem Beyah, the Georgia Tech professor who led the project. “If the attacker is smart and looking out for honeypots, maybe they’d look at different sensors on the robot, such as an accelerometer or speedometer, to verify the robot is doing what it had been instructed. That’s why we will be spoofing that information. The sensors will indicate the robot accelerated from point A to point B.”
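The article does not publish HoneyBot’s code, but the spoofing idea Beyah describes can be sketched simply: instead of reporting the stationary robot’s real readings, the honeypot synthesizes a plausible motion profile, such as a trapezoidal speed curve for a trip from A to B. The function name and parameters below are illustrative assumptions, not the researchers’ implementation.

```python
def spoofed_speed_profile(distance_m, top_speed=0.5, accel=0.25, dt=0.1):
    """Fabricate speedometer samples for a fake trip of `distance_m` meters:
    ramp up at `accel` m/s^2, cruise at `top_speed`, then brake to a stop."""
    samples = []
    speed = 0.0
    travelled = 0.0
    decelerating = False
    while not (decelerating and speed <= 0.0):
        # Distance needed to brake to a stop from the current speed.
        braking = (speed * speed) / (2 * accel)
        if decelerating or distance_m - travelled <= braking:
            decelerating = True
            speed = max(0.0, speed - accel * dt)   # slow down near point B
        else:
            speed = min(top_speed, speed + accel * dt)  # speed up / cruise
        travelled += speed * dt
        samples.append(round(speed, 3))
    return samples

# A fabricated 2-meter trip: starts slow, peaks at top speed, ends at rest.
trace = spoofed_speed_profile(2.0)
```

An attacker sampling this stream would see a smooth acceleration, cruise, and stop, consistent with a real traversal, while the physical robot never moves.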
In a factory setting, such a robot could sit motionless in a corner, springing to life when a hacker gains access, a visual indicator that a malicious actor is targeting the facility.
Rather than letting hackers run amok in the factory, it could follow harmless commands such as picking up objects, while stopping far short of doing anything dangerous.
So far, the researchers’ technique seems to be working. In experiments to test how convincing false sensor data is to individuals remotely controlling the device, volunteers used a virtual interface to guide the robot through a maze; they could not see what was really happening. To entice volunteers to break the rules, the interface presented forbidden “shortcuts” that would let them finish the maze faster.
In the real maze back in the lab, no shortcut existed; if participants opted to take one, the robot simply remained still. Meanwhile, those volunteers, who had unwittingly become the hackers in the experiment, were fed simulated sensor data indicating the robot had passed through the shortcut and continued along.
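The honeypot behavior described above reduces to a simple branch: harmless commands are actually executed, while forbidden ones leave the robot still and answer with fabricated telemetry. The sketch below is a hypothetical illustration of that logic; the names and callback structure are assumptions, not HoneyBot’s actual API.

```python
REAL, SIMULATED = "real", "simulated"

def handle_command(command, safe_commands, execute, simulate):
    """Run `command` for real only if it is harmless; otherwise keep the
    robot motionless and respond with fabricated sensor data instead."""
    if command in safe_commands:
        return REAL, execute(command)      # robot actually acts
    return SIMULATED, simulate(command)    # robot stays put; data is faked

# Usage with stand-in callbacks: a forbidden "shortcut" command gets
# simulated telemetry that falsely reports motion.
source, data = handle_command(
    "take_shortcut",
    safe_commands={"pick_up_object", "turn_left"},
    execute=lambda c: {"cmd": c, "moved": True},
    simulate=lambda c: {"cmd": c, "moved": True},  # the lie: reports motion
)
```

Either way, the operator receives data consistent with their command, so from their side of the interface the two paths look identical.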
In surveys after the experiment, participants who received genuine sensor data and those who were fed the simulated data about the fake shortcut rated the data as believable at similar rates.