It is strange seeing people claim Tesla is just doing PR spin here. They aren't saying this is the driver's fault because the driver decided to drive into the median. They are saying it is the driver's fault because he put too much faith in the autopilot system. Tesla's entire argument is that their autopilot is not as capable as some people believe. That is a weird stance to take if their only goal is to sell more cars.
>That is a weird stance to take if their only goal is to sell more cars.
That is only to weasel out of the legal issues. On the side, they are doing marketing that pushes the feature as vastly more capable. For example, Elon Musk tweeting videos that show cars traveling by themselves on unmarked roads, and making declarations such as "The human behind the wheel is only there for legal reasons, the car is driving all by itself."
So the actual marketing is based on this unrealistic projection, and only these "statements", which the public will soon forget, are based on its true capability.
Which supports my point. These responses likely had more influence from lawyers than PR folks. The goals of those two groups are almost diametrically opposed. The best thing for Tesla from a legal perspective is to downplay the capabilities of autopilot. The best thing for Tesla from a marketing and PR perspective is to exaggerate the capabilities of autopilot. Tesla is doing the former in these responses. They are choosing the legal response over the PR response. It should also be noted that this is probably the best move from a public safety standpoint as well. I don't know why people are claiming these responses are PR spin, or how these responses could possibly sell more cars.
Regarding the rest of your post, I was not talking about Tesla's general marketing. I haven't seen Musk do anything like you claim. It would certainly be misleading and irresponsible if he did do that and implied that any current Tesla can "drive all by itself".
>I was not talking about Tesla's general marketing. I haven't seen Musk do anything like you claim. It would certainly be misleading and irresponsible if he did do that and implied that any current Tesla can "drive all by itself"
Please take a look at the very start of the video attached here [1]. It says "The person in the driver seat is only there for legal reasons. He is not doing anything. The car is driving itself".
Now, the next one [2], where Elon Musk retweeted a video of a Tesla navigating an unmarked road. There is a reddit discussion about the same here [3].
That first video is not a demonstration of the current autopilot feature. It is a demonstration of a future full self-driving feature. Those two features are purchased separately from Tesla and carry their own multi-thousand dollar price tags. I think it is reasonable to expect an owner of the car to know the difference between the two, while the general public wouldn't have any clue about the differences. It is also reasonable to expect Tesla to make the difference clearer. Their responses to this accident are certainly doing that by making it clear that the driver should be attentive.
Both videos also show attentive drivers who are able to jump in at a moment's notice. The first video is a promo video and the second is from a Tesla fan. I wouldn't be surprised if they were specifically edited to showcase the car's features. I can certainly understand how someone might view these videos and think that autopilot is flawless, but I don't think Tesla is doing anything unethical in those videos.
I doubt it's really about legal issues. Couldn't they just disable or scale back the autopilot? Or at least warn people not to rely on it while avoiding any commentary about the facts of the accident? That would be more polite and deferential to the investigators and to the victim.
I wouldn't be surprised if there's some fear over allowing uncertainty to fester for a year while awaiting the results of the investigation. But I think there's a much bigger risk in being perceived as unwilling to play by the same rules as everyone else.
>Couldn't they just disable or scale back the autopilot?
The problem with this is that Tesla clearly believes that autopilot plus an attentive driver is safer than an attentive driver alone. I think that is most likely true.
It is possible that autopilot plus an inattentive driver is less safe than an attentive driver alone. I think this is plausible, but I don't think there is any real evidence to prove this one way or another.
The question then becomes: does Tesla have an obligation to save the people in the second group from themselves, and in turn put the people in the first group in greater danger?
That still doesn't explain why they couldn't just release a statement warning people not to rely on the autopilot until the authorities have completed their investigation.
That avoids prejudging whether it was primarily a defect with the vehicle, or driver error, or both.
Above all just respect the process. It's important to yield to an impartial entity when people have been hurt or killed.
> They are saying it is the driver's fault because he put too much faith into the autopilot system.
Any faith the driver had in Tesla's autopilot system would be due to Tesla's marketing. If Tesla believes its self-driving capabilities are not quite ready for market, maybe they shouldn't be selling the feature and calling it an "autopilot".