I notice nobody has commented on the fact that the driver should've reacted to the deer. It's not Tesla's responsibility to emergency brake, even if that is a feature in the system. Drivers are responsible for their vehicle's movements at the end of the day.
Meh, the "full self driving" will shut off 0.00001 seconds before impact so they can say "the system was not active at the time of the impact". Easy peasy.
A self-driving car, also known as an autonomous car (AC), driverless car, robotaxi, robotic car or robo-car,[1][2][3] is a car that is capable of operating with reduced or no human input.
But also
Organizations such as SAE have proposed terminology standards. However, most terms have no standard definition and are employed variously by vendors and others. Proposals to adopt aviation automation terminology for cars have not prevailed.
So there's no one definition. It is driving by itself. You don't have to do any driving, but you should stay alert so you can take over if something happens. Seems like it fits the general use imo, but it doesn't fulfill the more stringent definitions.
They're selling a spotty lane assist as Self Driving when it is not.
Other companies are selling actual self-driving cars (even if those companies are fucking up as well), but Tesla is nowhere near that level of autonomy. All because Musk cheaped out on the sensor package.
Teslas will never be self-driving, because they literally cannot reliably detect the road and obstacles with just their camera-only setup.
They should not be allowed to call it self-driving, or autopilot, or anything else that implies that you can take your hands off the steering wheel.
True, but if Tesla keeps acting like they're on the verge of an unsupervised, steering-wheel-free system... this is more evidence that they're not. I doubt we'll see a cybercab with no controls within the next 10 years if the current tech is still ignoring large, highly predictable objects in the road.