
AI driver: review of Mark Rober's LIDAR showcase

I think AI driver gives a good summary of some of the issues with Mark Rober's LIDAR showcase.

  • Some bias from Mark Rober's friendship with the LIDAR supplier.
  • Not using Tesla's Full Self-Driving (FSD).
  • Being so nervous that he accidentally disables Autopilot, twice.
  • The one-sided view of LIDAR, without showing examples where LIDAR has issues.
  • Gives a way forward for redoing the tests with FSD.
  • Mark Rober not giving the test his full attention; fun vs. the scientific process.
20 comments
  • The friendship with the LIDAR supplier aside (always a problem with these kinds of things),

    1. Full Self-Driving wouldn't have avoided the wall either.
    2. He's telling a story. Disabling Autopilot because he's nervous is good storytelling, because it's science communication, not science. It's also because he knows it's very likely to fail.
    3. He did show LIDAR having trouble - the heavy rain. It's just good enough to peek through it to stop the car before it hits the kid. It's just that the self-driving ALSO has trouble with this, and in fact worse trouble. LIDAR struggles in places where the light is going to be blocked, a situation that a pure camera solution on a Tesla is not going to solve. Not unlikely to solve, NOT GOING to solve. In the real world, if light can't penetrate through a thing twice, it's unlikely to only do it once.
    4. It's not supposed to be scientific, it's science communication. Science is testing and retesting. He did one test and called it good.

    Yup, LIDAR isn't a silver bullet for every sensing situation. But it's a damn sight better than pure cameras. And Musk would have known this if he were a good engineer. But he's not. He's a spoiled, rich, apartheid-loving, racist asshole that thinks he's a good programmer and engineer.

    • Why does no one mention why they said they dropped RADAR: Who do you trust when vision conflicts with RADAR? There were constant problems on this point. LIDAR is another redundant and possibly conflicting input.

      • Yup, but that's going to be true in every environment. Conflicting or noisy signals are always going to be there when you have multiple sensors. There are going to be conflicts between pure camera systems too - what if a camera sensor goes buggy and starts putting out data that says there's always a thing to the left?

        More systems giving data to establish ground truth is better. Don't Boeing yourself into thinking that one sensor is good enough - that's how you kill people.

        Edit: you also know how they're doing the depth detection with cameras? With AI. You know, that thing that keeps hallucinating data on us. So the data it's getting from the depth subsystem isn't ground truth; it's significantly worse and could be completely wrong.
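        To make the redundancy point concrete, here's a minimal sketch (purely illustrative, with made-up numbers): with three independent distance estimates, a simple median vote tolerates one sensor going buggy without poisoning the result.

        ```python
        # Hypothetical redundant-sensor voting sketch; not Tesla's actual fusion logic.
        from statistics import median

        def fused_distance(camera_m: float, radar_m: float, lidar_m: float) -> float:
            """Return the median of three independent distance readings (metres).

            The median ignores a single wildly wrong reading, unlike a mean."""
            return median([camera_m, radar_m, lidar_m])

        # Camera hallucinates an obstacle at 2 m; radar and LIDAR agree on ~40 m.
        print(fused_distance(2.0, 39.8, 40.1))  # -> 39.8
        ```

        With a single camera-only pipeline, that hallucinated 2 m reading would be the only "truth" the planner sees.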

      1. Until someone redoes the test, we won't know.
      2. Yes, if you do it knowingly, then redo it without taking control.
      3. Yes and no, he did put it down as a win.
      4. Mark is an engineer and has a good history of using the scientific process. He has previously shown that the two can be combined.

      Sadly, I know Tesla has lost a lot of respect due to Elon, especially over the last four years.

      Engineering is in large part about balancing cost vs. features. Yes, there will be cases where LIDAR is better, but it comes at a high cost. Think about it: what is the break-even point where the 0.1% of edge cases in which LIDAR does better justifies the cost?
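      A back-of-the-envelope sketch of that break-even question (all numbers hypothetical, just to show the shape of the trade-off):

      ```python
      # Hypothetical break-even sketch: every number below is made up for illustration.
      def lidar_breaks_even(lidar_cost: float, crash_cost: float, edge_case_rate: float) -> bool:
          """True if the expected per-car cost of edge-case crashes that only
          LIDAR would prevent exceeds the per-car cost of adding LIDAR."""
          return crash_cost * edge_case_rate > lidar_cost

      # e.g. a $1,000 sensor suite vs. a $500,000 liability at a 0.1% per-car rate
      print(lidar_breaks_even(1000, 500_000, 0.001))    # -> False (doesn't pay off)
      print(lidar_breaks_even(1000, 2_000_000, 0.001))  # -> True  (pays off)
      ```

      The answer flips entirely on what you assume the crash cost and edge-case rate to be, which is exactly why reasonable engineers disagree here.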

      If the tests Mark showed were sprung on unsuspecting human drivers, I think many of them would fail too. Spotting a dark stationary object in fog or heavy rain is very difficult.

  • I don't think it was a very good response. All the points he brings up don't change any of the outcomes; FSD would likely still fail all these tests.

    I've watched a lot of AI driver and I think FSD is cool, but its lack of LIDAR is a huge mistake and creates these dangerous situations where Tesla isn't smart enough to mimic the awareness of a real driver.

    • lack of LIDAR is a huge mistake and creates these dangerous situations where Tesla isn't smart enough to mimic the awareness of a real driver.

      Can you give some examples?
