The cruise control on my Subaru probably wouldn't hit a fire truck, since it has pre-collision braking. But in any case, it's not sold as "Autopilot" or "Full Self Driving", so the limitations are pretty clear. There's a reason several countries stopped Tesla from marketing their nonsense like they do in the US - it turns out that when you say things are self driving, consumers believe you and don't read the fine print.
Most TACC systems won’t brake for objects moving slower than (vehicle speed minus some threshold), due to how the radar works. You don’t get relative speeds and distances for a set of objects; you get a spectrum showing how much of what the radar sees is moving at each relative speed.
That’s also why most TACC doesn’t work below 30km/h or so. It can’t tell the difference between the car in front and a stationary object next to the road.
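To make the tradeoff concrete, here's a toy sketch (my own illustration, not any vendor's actual algorithm) of that filtering step. All names and thresholds are made up; the point is just that everything closing at roughly your own speed lands in the same relative-speed bin, so roadside clutter and a stopped vehicle in your lane are indistinguishable.

```python
EGO_SPEED = 30.0          # m/s, our own speed
STATIONARY_CUTOFF = 2.0   # m/s margin around "closing at ego speed"

# Simulated Doppler spectrum: relative speed (m/s) -> return magnitude.
# Guard rails, signs, AND a stopped fire truck all close at -EGO_SPEED,
# so they merge into one bin and cannot be told apart.
spectrum = {
    -30.0: 9.5,   # stationary world, possibly including a stopped truck
     -5.0: 1.2,   # lead car we're slowly catching up to
      0.0: 0.8,   # car matching our speed
}

def track_targets(spectrum, ego_speed, cutoff=STATIONARY_CUTOFF):
    """Keep only returns that are plausibly moving vehicles."""
    targets = {}
    for rel_speed, magnitude in spectrum.items():
        # Discarding the "closing at ego speed" bin suppresses false
        # brakes on signs and bridges, but also throws away genuinely
        # stopped vehicles in our lane.
        if abs(rel_speed + ego_speed) <= cutoff:
            continue
        targets[rel_speed] = magnitude
    return targets

print(track_targets(spectrum, EGO_SPEED))
# → {-5.0: 1.2, 0.0: 0.8}  (the -30.0 bin, stopped truck included, is gone)
```

It also shows the low-speed problem: if EGO_SPEED drops near the cutoff, the "stationary world" bin starts overlapping with slow traffic ahead, which is roughly why these systems disengage below ~30 km/h.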
Edit: This is actually an interesting rabbit hole to go down, because on the face of it, it seems like a fairly straightforward system to implement. Then the more you think about how a 'simple' system should handle various scenarios, the more you realise that doing it properly means 'just keep a good distance from the car in front' demands most of the perception needed for general free-form driving.
This isn't true (on Subaru at least, likely due to their stereo EyeSight cameras...) The primary purpose of their suite of driver aids is to prevent accidents, so it's actually really good at that. There are plenty of videos to attest to that, but here's an example of a stop from 60 km/h when the system detects a cone:
I guess you did say braking with TACC is complicated -- but I think that's a bit of an edge case that only applies to Tesla. The Subaru version of TACC is ~fine, but it never gives you the impression it will handle 100% of the driving for you, so you'd never leave it to navigate around emergency vehicles by itself. Maybe the solution to rarely-dangerous self-driving is to make these systems only ~75% good so drivers are always aware?
That's pretty cool! Much better than the radar-based TACC on the Hilux I rented a couple of weeks ago. I came away from that one with a strong impression that it was inconsistent in a scary way and shouldn't have shipped like that.
Do you have one (or experience driving one)? If so:
- How does it go with phantom braking and various objects on the side of the road?
- Does it distinguish between various on-road things? (eg. does it brake for a dog? a plastic bag? a puff of smoke? an anvil falling off the back of the truck in front?)
- How good's its understanding of lanes (eg. a very slow car that's in another lane but currently directly in front of you due to the road curving)?
- Does it brake for objects outside the lane entering it (cross traffic, lane cut-ins, pedestrians, vegetation moving around in high winds)?
It is important to note that suddenly, and against all probability, a sperm whale had been called into existence, several miles above the surface of an alien planet. But since this is not a naturally tenable position for a whale, this innocent creature had very little time to come to terms with its identity.
1) Most consumers have absolutely no idea how airplane autopilot works. Pulling a “well actually” on this does not remove the issue of misleading marketing.
2) Yes, and the videos I’ve seen from those beta testers are terrifying.
No, I don’t think people believe that at all. A certain meme entered pop culture years ago that either the pilot is there for takeoff and landing, or the pilot is there in order to make the passengers feel safer. I certainly recall a lot of “planes these days basically fly themselves” articles. Neither of these are strictly correct about how autopilot works, but we are arguing about cultural understanding here.
Even taking the lesser of these two scenarios, this would imply that Tesla autopilot should be capable of doing most of the driving, with the driver just there for whatever bits you might argue are equivalent to a takeoff and landing. Given the stories of Teslas accelerating into stationary objects or totally losing track of the road lane, I don’t think even this limited understanding of what “autopilot” means is correct.
> "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself."
> “Full Self-Driving Capability All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat. All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you. The future use of these features without supervision is dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving capabilities are introduced, your car will be continuously upgraded through over-the-air software updates.”
Yeah, Musk blatantly lying about these things really doesn’t help their case one bit. It’s hard to hide behind “autopilot is just marketing” when their popular CEO is lying about its capabilities left and right.
https://www.reuters.com/article/us-tesla-autopilot-germany/g...