Tesla Autopilot Tricked to Drive the Wrong Way

Hackers fooled a Tesla Autopilot feature into steering a self-driving car left of center and into the wrong lane.  They did this with relatively simple means: remote root control of the vehicle and a few stickers on the road.

Researchers (the hackers) recently published a report explaining how the Tesla Autopilot electronic control unit (ECU) was compromised through root security weaknesses in software version 18.6.1, giving them remote control of a Tesla Model S steering wheel.  The researchers had spent three months hijacking Tesla DNS traffic on D-Link routers.  The white-hat hacking team was able to dynamically inject malicious code into the steering control mechanisms and remotely take control of the steering wheel from a mobile device, which was paired with a gamepad over Bluetooth for approximate steering control.  While the car was in APC (Automatic Parking Control) mode, the researchers could seize control of steering at roughly 5 mph; at higher speeds, there were no such limitations.
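
The report does not disclose Tesla's proprietary message formats, but the general technique of injecting a steering command onto a vehicle's CAN bus can be sketched with the python-can library.  The arbitration ID, payload encoding, and interface name below are hypothetical placeholders for illustration only, not values taken from the research.

```python
# Illustrative sketch only: generic CAN frame injection with python-can.
# The arbitration ID (0x123) and payload encoding are hypothetical placeholders,
# NOT Tesla's actual message format, which the researchers did not publish.
import can

def send_steering_frame(bus: can.BusABC, angle_deg: float) -> None:
    """Encode a hypothetical steering-angle command and put it on the bus."""
    raw = int((angle_deg + 90) * 100) & 0xFFFF      # toy 16-bit encoding
    msg = can.Message(
        arbitration_id=0x123,                       # placeholder ID
        data=[raw >> 8, raw & 0xFF, 0, 0, 0, 0, 0, 0],
        is_extended_id=False,
    )
    bus.send(msg)

if __name__ == "__main__":
    # Interface name is an assumption; a virtual 'vcan0' device works for testing.
    with can.interface.Bus(channel="vcan0", bustype="socketcan") as bus:
        send_steering_frame(bus, angle_deg=-5.0)    # hypothetical gentle left turn
```

In the real attack chain, frames like these would have to originate from compromised code already running on the vehicle; the sketch only shows what a single injected command looks like at the bus level.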

After analyzing the controller area network (CAN) messaging functions onboard the car’s system, the researchers were also able to tamper with how the vehicle recognized traffic lanes.  In Tesla Autosteer mode, the vehicle uses computer vision and camera feeds to detect and navigate lanes, but "a potential high-risk design weakness" permitted the team to lead a Tesla car in the wrong direction.  The Tencent researchers tested the theory by applying simple stickers to a road surface; this confused the machine-vision system enough to fail silently and, theoretically, could be used to divert these cars into oncoming traffic.  The problem lies within the single neural network that Tesla uses to detect lanes, among other functions: images from a camera are processed, fed into the network, and the output is then saved and added to a virtual map of the vehicle's surroundings.[1]
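
Tesla's actual network and map format are not public, but the pipeline the report describes, camera frame in, lane geometry out, results accumulated into a virtual map, can be sketched roughly as follows.  The detector, preprocessing, and map structure here are stand-ins, not the production implementation.

```python
# Rough sketch of the lane-detection flow described in the report:
# camera frame -> preprocessing -> neural network -> lane output -> virtual map.
# The "detector" below is a stand-in; Tesla's real model and map format are not public.
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Normalize the raw camera frame before feeding it to the network."""
    return frame.astype(np.float32) / 255.0

def detect_lanes(frame: np.ndarray) -> list[tuple[float, float]]:
    """Placeholder detector: returns hypothetical lane-boundary points (x, y) in metres."""
    return [(-1.8, float(y)) for y in range(0, 30, 5)] + \
           [(+1.8, float(y)) for y in range(0, 30, 5)]

virtual_map: list[tuple[float, float]] = []   # accumulated lane geometry around the car

def process_frame(frame: np.ndarray) -> None:
    lanes = detect_lanes(preprocess(frame))
    virtual_map.extend(lanes)                 # network output is saved into the local map

if __name__ == "__main__":
    dummy_frame = np.zeros((720, 1280, 3), dtype=np.uint8)
    process_frame(dummy_frame)
    print(f"map now holds {len(virtual_map)} lane points")
```

The relevant point for the attack is that whatever the network reports as lane geometry is trusted downstream, so anything that fools the detector ends up in the map the car steers by.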

With a gaming controller managing the car's auto-steering decisions, the researchers created an attack scenario in which the camera feed was compromised by way of three stickers on the road, changing the car's trajectory.  The small, inconspicuous stickers created a fake lane, and the system failed to notice that this lane directed the car toward the adjacent lane.  The vulnerabilities and security weaknesses found by Tencent were reported to Tesla and have since been resolved.  A Tesla spokesperson said the attack "is not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so."  So much for the self-driving car concept.
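
A much-simplified way to see why a few well-placed marks matter: if lane geometry is estimated by fitting a line through detected marking points, a handful of extra detections angled toward the adjacent lane skews the fit.  This toy example illustrates the general idea only; it is not Tencent's attack or Tesla's lane-fitting code, and all coordinates are made up.

```python
# Simplified illustration of how a few extra "sticker" detections can skew a lane fit.
# Toy model of the general idea, not the researchers' actual attack.
import numpy as np

def fit_lane(points: np.ndarray) -> np.ndarray:
    """Fit x = a*y + b through detected lane-marking points (columns: x, y in metres)."""
    return np.polyfit(points[:, 1], points[:, 0], deg=1)

# Genuine lane markings: straight ahead at x = +1.8 m.
real = np.array([[1.8, y] for y in range(0, 30, 3)], dtype=float)

# Three hypothetical sticker detections angled toward the oncoming lane.
stickers = np.array([[1.2, 32.0], [0.4, 36.0], [-0.4, 40.0]])

clean_fit = fit_lane(real)
spoofed_fit = fit_lane(np.vstack([real, stickers]))

print("slope without stickers:", round(clean_fit[0], 3))    # ~0.0 (straight ahead)
print("slope with stickers:   ", round(spoofed_fit[0], 3))  # negative: lane drifts left
```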

Tesla vehicles are electric, heavily connected cars, which exposes more vulnerabilities.  There is a constant flood of security concerns surrounding our current and future electric vehicles.  Modern attack paths may include exploits relating to Bluetooth, cellular connections, WiFi, vendor interfaces, and external storage such as USB drives, with the majority of known exploits considered to be of "medium" severity.

Hackers continue to focus on the automobile industry, in this case trying to help the auto company.  How far in the future is the advent of self-driving cars?  An unnamed industry cyber threat professional and Wapack Labs source, who works within the self-driving automobile industry, indicates that self-driving vehicles on dedicated routes will be a reality within five years.  Time, technology, and absolute safety will determine full production.

Wapack Labs is located in New Boston, NH.  We are a Cyber Threat Analysis and Intelligence organization.  For questions or comments regarding this report, please contact the lab directly by phone at 1-844-492-7225 or by email at feedback@wapacklabs.com.

[1] https://www.zdnet.com/article/hackers-reveal-how-to-trick-a-tesla-into-steering-towards-oncoming-traffic/
