"the hack requires physical access to the circuit board, removing and reinstalling it without damage, and soldering skills. Therefore, such an attack would not be very practical outside the laboratory ".
If it upgrades features, there will be a market for "jailbreak my Tesla".
But it costs all of $600! Do you really think someone who just dropped $40,000+ on a ridiculous status symbol is going to want to blow another six whole Benjamins to enable completely unnecessary and buggy features?
No idea where the $600 figure comes from; they did it with a laptop and a Teensy. That article is pretty much garbage, referencing "Spiegel", which is itself pretty clueless about this kind of stuff. The whole "Elon mode" thing is their own clickbait spin; it's much less prominent in the original work.
I think everyone will have great luck insuring, based on my experience of never having an insurance company ask me about aftermarket mods.
Many years ago, I had a custom chip that increased horsepower, turbo boost, and other tuning parameters. I had my car serviced under warranty by just swapping the chip out before dealer visits.
I have suggested that IF (the big if) we all have autonomous vehicles, what would the market segmentation look like when performance is no longer a significant differentiator?
If the vehicles don't crash, then impact resistance of the structure is not required, especially if "urban only" were a thing.
Everything from the absolute budget model with a plastic shell and bench seats, up to super-luxury plush seating.
If the crash structures all remained what they are now...
Can you pay for "exciting mode" which maximizes braking and acceleration?
What about "preservation mode" which prioritizes the safety of the occupants over the safety of third parties in the event of an incident? You can bet certain people will have that made available.
"Defect mode" which cuts people up when merging.
"Overtaker" which attempts to overtake other vehicles rather thango at the average speed.
It seems pretty clear that if the vehicle becomes capable of real active driving, then there should be the equivalent of a human driving test. Only the features that pass the test can go into the vehicle, so per your example, an algorithm that cuts people up when merging would simply fail the driving test and wouldn't be acceptable for deployment.
I think the likeliest case for full self driving is almost the opposite of what you're describing. Because it's all code, and it's reproducible and testable, the car company is liable if they put out something that speeds or drives dangerously. The result will be FSD cars all driving at the speed limit, leaving appropriate braking distances, etc. - driving much more conservatively than the equivalent human driver. It's also about incentives: you as the consumer might want a car that drives a hundred miles per hour and weaves in and out of traffic, but if the car company is the one facing the liability for it, then it's not going to happen.