Tesla Whistleblower Says 'Autopilot' System Is Not Safe Enough To Be Used On Public Roads

"It affects all of us because we are essentially experiments in public roads."
I lost all trust in their 'Autopilot' the day I read that Musk said (paraphrasing) "All we need are cameras; there's no need for secondary/tertiary LIDAR or other expensive setups."
Like TFYM? No backups?? Or backups to the backups?? On a life fucking critical system?!
I lost trust in his bullshittery a long time ago, but his need to bring up the cost of critical safety systems is what stuck out to me the most here. That's how you know the priorities are backwards.
Also, my robot vacuum has LiDAR. It’s not expensive relative to a car.
Skimping on cost is how disasters happen. Ask Richard Hammond. "Spared no expense" my ass, hire more than 2 programmers, you cheap fuck.
Edit: This was supposed to be a Jurassic Park reference, but my dumb ass mixed up John Hammond and Richard Hammond. That's what I get for watching Top Gear and reading at the same time.
The crazier and stupider shit was that part of his justification was "people drive and they only have eyes. We should be able to do the same."
It's a stunningly idiotic justification, and yet here we are with millions of these "eyes only" Teslas on the road.
That's terrifying, because it shows how little he understands about the problem he's attempting to solve.
Humans use up to four senses at times to accomplish the task of driving: sight, hearing, touch, and our sense of balance and motion.
Reminds me of Mao not brushing his teeth, because tigers didn't brush theirs either.
Ah, but you see, his reasoning is: what if the camera and lidar disagree, then what? With a camera-only system there is only one truth, with no conflicts!
Like when the camera sees the broad side of a white truck as clear sky and slams right into it: there was never any conflict anywhere, everything went just as it was suppo... wait, shit.
RIP Joshua Brown:
This (sensor fusion) is a valid issue in mobile robotics. Adding more sensors doesn't necessarily improve stability or reliability.
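To make that concrete, here's a minimal sketch of the textbook approach (inverse-variance weighting, the same idea a Kalman filter update uses); the ranges and variances are made-up numbers, not anything from a real autonomy stack:

```python
# Minimal inverse-variance fusion of two range estimates to one obstacle.
# All values are illustrative assumptions, not real sensor specs.

def fuse(est_a, var_a, est_b, var_b):
    """Weight each estimate by the inverse of its variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Camera thinks the obstacle is 48 m away; lidar says 30 m.
camera_range, camera_var = 48.0, 1.0   # overconfident and wrong
lidar_range, lidar_var = 30.0, 0.25    # roughly right

print(fuse(camera_range, camera_var, lidar_range, lidar_var))
# -> (33.6, 0.2): pulled toward the lidar, but only because each
#    sensor's claimed variance is at least somewhat honest.
```

The hard part isn't the arithmetic, it's getting every sensor to report an honest error model; feed the filter an overconfident wrong estimate and the fused answer is still wrong.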
To be fair, humans have proven all you need are visual receptors to navigate properly.
To be fair, current computers / AI / whatever marketing name you call them aren't as good as human brains.
Visual receptors... and 3-dimensional vision, with all the processing and decision-making behind it based on those visual stimuli, lol.
Uhhhh…
…any level 4 car, actually, according to the federal government and all the agencies that regulate this stuff.
NAVYA, Volvo/Audi, Mercedes, Magna, Baidu, Waymo.
Tesla isn’t even trying to go past level 3 at this point.
A 2014 Infiniti can drive itself more safely on the highway than a Tesla. The key here is that they didn't lie about the car's capabilities, so they didn't encourage complacency.
In the city though, yeah you'll need to look at other level 4 cars.
All of them. The other automakers didn't fire their engineers during a hissy fit.
Not to be a hard-on about it, but if the cameras have any problem, Autopilot ejects gracefully and hands it over to the driver.
I ain't no Elon dick rider, but I've got FSD, and the radar would see manhole covers and freak the fuck out. It was annoying as hell and pissed my wife off. The optical depth estimation is now far more useful than the radar sensor was.
Lidar has severe problems too. I've used it many times professionally for mapping spaces. Reflective surfaces fuck it up. It delivers bad data frequently.
Cameras will eventually be great! Really, they already are, but they'll get orders of magnitude better. Yeah, 4 years ago the AI failed to recognize a rectangle as a truck, but it ain't done learning yet.
That driver really should have been paying attention. The car fucking tells you to all the time.
If a camera has a problem, the whole system aborts.
In the future this will mean the car pulls over, but it's, as it makes totally fucking clear, in beta. So for now it aborts and passes control to the human who is paying attention.
So I drive a Tesla as well. Quite often I get the message that a camera is blocked by something (like sun, fog, or heavy rain).
You can't have a reliable self-driving system if that is the case.
Furthermore, isn't it technically possible to train AI on the lidar and radar data as well?
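For what it's worth, that's exactly the argument for redundant modalities: a blocked camera in a camera-only stack forces a handover, while a fused stack could degrade first. Here's a minimal sketch of such a fallback policy, with hypothetical sensor names, confidence numbers, and threshold (nothing from any real firmware):

```python
# Hypothetical degradation policy. Sensor names, confidences and the
# 0.5 threshold are illustrative assumptions, not a real system's.

CONF_THRESHOLD = 0.5

def plan_action(sensors):
    """Decide what the stack could do given per-sensor confidence scores."""
    usable = {name for name, conf in sensors.items() if conf >= CONF_THRESHOLD}
    if not usable:
        return "disengage: hand control to the human now"
    if "camera" not in usable:
        return "degrade: slow down and lean on " + ", ".join(sorted(usable))
    return "nominal: full feature set"

# Low sun washes out the camera.
print(plan_action({"camera": 0.2}))
# -> disengage: hand control to the human now
print(plan_action({"camera": 0.2, "radar": 0.9, "lidar": 0.8}))
# -> degrade: slow down and lean on lidar, radar
```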
Gracefully? LMAO
You can come back when it gives at least 3 minutes of warning in advance, so that I can wake up, get my hands out of the woman, climb into the driver's seat, find my glasses somewhere, look around to see where we are, and then tell that effing autopilot that it's okay and it is allowed to disengage now!
Starting off with 3D data will always be better than inferring it. Go fire up Adobe After Effects and do a 3D track and see how awful it is; now that same awful process drives your car.
The AI argument falls short too, because that same AI would be better if it just started off with mostly complete 3D data from lidar and sonar.
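To put rough numbers on "starting off with 3D data": camera depth has to be triangulated or learned, and triangulation error grows with the square of distance, while a time-of-flight lidar's range error stays roughly flat. The sketch below uses the standard stereo error model as a stand-in for camera depth inference; the baseline, focal length, and noise figures are assumed ballpark values, not Tesla's:

```python
# Triangulated depth error: sigma_Z ~= Z^2 * sigma_d / (f * B).
# A time-of-flight lidar's range error is roughly constant with distance.
# All numbers are assumed ballpark figures for illustration only.

FOCAL_PX = 1000.0          # focal length in pixels (assumed)
BASELINE_M = 0.30          # camera baseline in metres (assumed)
DISPARITY_NOISE_PX = 0.5   # sub-pixel matching noise (assumed)
LIDAR_SIGMA_M = 0.03       # ~3 cm, a typical automotive lidar spec

def stereo_depth_sigma(depth_m):
    """1-sigma depth error for triangulated depth at the given range."""
    return depth_m ** 2 * DISPARITY_NOISE_PX / (FOCAL_PX * BASELINE_M)

for z in (10, 30, 60, 100):
    print(f"{z:>3} m: camera ±{stereo_depth_sigma(z):5.2f} m, lidar ±{LIDAR_SIGMA_M:.2f} m")
# At 10 m the camera estimate is fine (~0.17 m); at 100 m it's off by
# many metres while the lidar error hasn't budged.
```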
This is exactly the problem. If I'm driving, I need to be alert to the driving tasks and what's happening on the road.
If I'm not driving because I'm using autopilot, ... I still need to be alert to the driving tasks and what's happening on the road. It's all of the work with none of the fun of driving.
Fuck that. What I want is a robot chauffeur, not a robot version of everyone's granddad who really shouldn't be driving anymore.
Just in time to slam you into an emergency vehicle at 80... but hey, autopilot wasn't on during the impact, so it's not Musk's fault.
good thing regular cameras aren't affected by reflective surfaces
oh wait