Cruise has also previously maintained that its record of driverless miles has outperformed comparable human drivers in terms of safety, notably crash rates.
But is this actually true? I hate that they just printed this without any attempt to verify it. Surely some independent body has looked into this by now.
It's not true, this is not the first time Cruise has been caught lying, and at some point an adult needs to step up and tell them to stop putting people in danger.
Even Waymo has commented in the past about Cruise playing fast and loose with the definitions of things that needed to be reported.
Waymo cars seem to operate much more sensibly than Cruise ones from what I've watched and read... although IMO that is mainly down to the car calling it quits much sooner and asking for an operator to take control, and driving in a different environment in general.
Cruise on the other hand seems to just carry on anyway, unless its lidar is blocked 😳
Yeah, I mean, some food for thought here is that Waymo started out as a research project and has been doing this since 2009 and they're ultra conservative with their behaviors. Before starting in 2009, the beginnings of the team were recruited from DARPA Grand Challenge participants. And even they have major mishaps.
Cruise, on the other hand, started out trying to sell retrofit hardware right away. Then tried convincing people they could do city driving right away. Now GM has revenue targets for them, like any adult business would, and they have no hope of ever accomplishing them. So, they're back to their old tricks, cutting down the number of miles driven for training models, rushing vehicles into service with no monitoring operators in them, deceiving investors and regulators about remote operations.
One is a slow, methodical money furnace that attempts to solve the larger problem set. The other is a fast moving money furnace that tries to get people to pay them for half measures.
Waymo's progress is probably a good indicator as to how far along we are with self driving cars IMO. Given that Waymo has their cars pretty thoroughly trained on set routes (well, even we humans need to learn or try various routes before we're fully confident on them sometimes), Cruise cheaping out on the whole training process is only going to accelerate their demise... especially when it's at the expense of pedestrians' safety.
If you really want your mind blown, the first autonomous vehicle to drive coast to coast in the US happened in 1989. It was a vehicle from Carnegie Mellon University called NavLab. It used lidar, cameras, radar, and ultrasonics. Literally the same stuff we're using today.
Pretty sure it IS factually true, but the real question is, "why?". Is it because everyone is wary around a car with a huge-ass camera and sensor system on top that doesn't have a driver? Or because the system is good?
I am sure it is true in at least some sense because they would be called out on an outright lie, but there are many ways you can deceive with true numbers. And I don’t trust them to be fully honest.
But if it is accurate I’d like to see an independent analysis rather than the company’s spin on it.
Driverless cars are certainly less error-prone overall than human-operated ones. Distraction, sleepiness, intoxication, hubris, and other common "human error" causes of accidents are eliminated. Now we're seeing, though, that human beings - even pretty average ones - are still able to make better judgments in unique situations.
Because the recent incidents have been so laughably stupid from a human perspective, the instinct is to doubt the accuracy of driverless cars in all situations. The robots are able to do the comparatively simple things extremely well. It's just the more complex things they still have trouble with - so far. They're still safer than human operators, and will only continue to get better.
Humans make the same mistakes though. Backup cameras were added to cars because humans kept running over people, especially kids. People block emergency vehicles all the time.
Yes, the automation will always have room for improvement, but the current 'newsworthy' incidents are rarely in the news when humans do the exact same thing.
I suspect there is something more to this than just that. After all, the car in question did this:
Earlier this month, a Cruise robotaxi notably ran over a pedestrian who had been hit by another vehicle driven by a human. The pedestrian became pinned under a tire of the Cruise vehicle after it came to a stop — and then was pulled for about 20 feet (six meters) as the car attempted to move off the road.
It seems like there are unsolvable safety problems going on.
Yes, the car does not appear to have safety features that let it know a body is caught underneath, but it did try to get out of traffic after the collision.
Or it is an opportunity to add some additional sensors underneath that will make it miles better than human drivers.
Really the main problem with autonomous cars at this point in time is a combination of the companies hiding issues and the public expecting perfection. More transparency and a 3rd party comparison to human drivers would be the best way to both improve automation and gain public trust when they actually see how bad human drivers can be.
Also, charge corporations for beta-testing on the fucking public... they're using taxpayer-funded roads and putting our lives at risk for their profits. They should share those profits far, far more than they do.
You would think a self driving car could have 360 degrees of vision and not run into things, whether it's a firetruck or a cardboard box or a person. That should be job 1 for self driving.