Following news that most Honda cars from 2012 onwards can be remotely unlocked, and even started, by hackers: https://www.
"This attack is even worse than the “rolljam” security flaw that Samy Kamkar famously demonstrated in 2015. At least with that attack, the car and remotes implemented “rolling codes” that change with every transmission to prevent simply intercepting and replaying the same code over and over. With this new attack on Honda vehicles, once an attacker captures the codes, they effectively have indefinite access to a specific car’s lock, unlock, and in some cases remote engine start functionality. Honda’s comments that exploiting this vulnerability would take “determined and very technologically sophisticated thieves” seem to be minimizing the issue. It’s similar to yelling your password across a room and hoping no one happens to be listening. Yes, someone has to be close enough to hear it and then know what to do with it, but after that it’s very simple to exploit.
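The core weakness described above — a remote that transmits the same code every time — can be sketched in a few lines. This is purely an illustrative model, not Honda’s actual RF protocol; all names and values here are hypothetical.

```python
import hmac

# Hypothetical model of a static-code fob like the vulnerable remotes:
# the car accepts the same fixed bytes on every press, so one recorded
# transmission works forever.

SECRET_CODE = b"fob-static-code-1234"  # illustrative value

def fob_press() -> bytes:
    """The fob sends the identical code on every button press."""
    return SECRET_CODE

def car_accepts(transmission: bytes) -> bool:
    """The car only checks that the bytes match its stored code."""
    return hmac.compare_digest(transmission, SECRET_CODE)

# Legitimate unlock works...
assert car_accepts(fob_press())

# ...but an eavesdropper who records one press can replay it indefinitely.
captured = fob_press()
assert car_accepts(captured)
assert car_accepts(captured)  # still accepted, any number of times
```

Nothing about the receiver distinguishes a fresh press from a replayed capture — which is exactly why possession of one recorded transmission grants indefinite access.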
"Honda’s unwillingness or inability to correct the issue in older cars is part of a class of problem that I’m afraid will become massive over time. Manufacturers’ abandonment of security fixes in otherwise completely functional devices is going to be a huge problem. As more and more devices add “smart” functions, it’s inevitable that vulnerabilities will be discovered that put those devices or their data at risk. If there’s no patch available, or worse, no mechanism to patch at all, users will have to choose between remaining at risk of exploitation and trashing the vulnerable device. Neither is an ideal situation.
"First, what I see is a glaring, however widespread, lack of controls in this particular system through its use of static codes, especially since this is a threat I’d consider, if not a well-known “solved” problem, at least an “addressed” one. Heck, my garage door has been using rolling codes since 2005. There are myriad other examples of security controls that don’t hold up to even passing scrutiny (easily cloneable building key cards that in effect only transmit a static identifier, I’m looking at you). Observing this across many different areas over the years has led me to the conclusion that the vast majority of people are either good (or maybe just lazy?), because good lord, some of this is so incredibly easy to defeat. I do completely agree that attacks requiring physical presence are orders of magnitude less likely than network-based ones, especially against buildings likely to have secondary deterrent controls like camera monitoring and after-hours alarms. Where I see increased risk here, though, is the combination of the ease of the attack with simple equipment, the fact that it works at the same range as the key fobs themselves (meaning it could be conducted across a decent-sized parking lot), and the potentially wide attack surface spanning many models of Honda cars across several years. I have no idea what the total number of affected cars is, but I can easily guess it to be in the tens to hundreds of thousands. I do tend to be paranoid about these things, though, and that definitely colors my thinking.
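The rolling-code mitigation mentioned above (the thing garage doors have done for years) can also be sketched briefly. This is a simplified, hypothetical scheme in the spirit of KeeLoq-style remotes, not any manufacturer’s actual implementation: fob and receiver share a key and a counter, each press authenticates a fresh counter value, and the receiver only accepts counters ahead of the last one it saw, within a small resync window.

```python
import hmac
import hashlib

# Hypothetical rolling-code sketch: counter + MAC + acceptance window.
# KEY and WINDOW are illustrative values, not from any real product.

KEY = b"shared-secret-key"
WINDOW = 16  # how far the fob's counter may drift ahead (missed presses)

def make_code(counter: int) -> bytes:
    """Fob side: transmit the counter plus a MAC over it."""
    ctr = counter.to_bytes(4, "big")
    return ctr + hmac.new(KEY, ctr, hashlib.sha256).digest()

class Receiver:
    def __init__(self) -> None:
        self.last_counter = 0

    def accepts(self, code: bytes) -> bool:
        counter = int.from_bytes(code[:4], "big")
        # Verify the MAC first.
        if not hmac.compare_digest(code, make_code(counter)):
            return False
        # Reject stale (replayed) or implausibly far-ahead counters.
        if not (self.last_counter < counter <= self.last_counter + WINDOW):
            return False
        self.last_counter = counter
        return True

car = Receiver()
press = make_code(1)
assert car.accepts(press)        # fresh press accepted
assert not car.accepts(press)    # replaying the same capture is rejected
assert car.accepts(make_code(2)) # the next legitimate press still works
```

The key property is that a captured transmission becomes worthless the moment the receiver has seen its counter, which is what makes the static-code design stand out as such a glaring omission.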
"Calling out Honda’s refusal, or even inability, to fix the issue was based on their own comments that they’ve essentially filed this bug report as “won’t fix”. It’s spot on that this may be unfair if it’s a somewhat common implementation, but I think “everyone else does it too” leads us to, well, *gestures broadly at everything in breach news*. I especially feel this way about this particular situation because rolling codes are a well-known and easily implemented mitigation, and the researchers found that the issue still affects models as late as 2020. The larger issue I was trying to get at, and probably didn’t do nearly a good enough job of conveying, is that it’s not just Honda: tons of other manufacturers across many industries, especially in smart devices, either refuse to correct security issues in legacy products or, worse, have no mechanism to do so at all. With software that’s not such a big deal, as you can implement your own mitigations or likely find an easy-to-adopt alternative; but when that spills over to cars, appliances, or devices that otherwise work perfectly well, you’re stuck with the tradeoff of remaining at risk or throwing the thing away. I really don’t like that this is the situation for many things now, and I think it’s an issue that’s going to explode in the coming years, as the trend is that everything just has to be “smart” these days, whether I personally think it makes sense or not."