• kingofras@lemmy.world · 1 month ago

    The funny thing is that the YT comment section was almost immediately brigaded by Elmo suckups complaining that the tester was using Autopilot and not the latest version of FSD, completely missing the point that the LiDAR vehicle stopped regardless of which automation systems were turned on.

    This was a test of both cars’ emergency braking systems in scenarios as close to the real world as possible. Whether you run over a child in heavy rain or dense fog should not depend on whether the driver remembered to switch certain safety systems on before departure.

    It’s a massive fail for Tesla, and it’s the closest thing to scientific proof that the Elmo is a rich daddy’s boy with shit for brains. Once you have that level of money and leverage, you can simply make the most stupid decisions ever and keep falling upwards. Meanwhile real people are dying almost weekly in his vehicles, and reporters and news organisations are scared shitless to report on it because he can cause a lot of pain with his imaginary cash. Not to mention no govt agency will investigate properly, because he’s got root access to that too. All those deaths are entirely preventable.

    Keep burning them, keep protesting at the dealerships. This truly could be the first billionaire we bring down. Watch him activate the OTA kill switch in every Tesla before they go under too. He’s thát kind of baby. Ironically that would save a lot of lives.

  • vaguerant@fedia.io · 1 month ago

    The big headline is understandably that it crashes into a fake painted wall like a cartoon, but that’s not something that most drivers are likely to encounter on the road. The other two comparisons where lidar succeeded and cameras failed were the fog and rain tests, where the Tesla ran over a mannequin that was concealed from standard optical cameras by extreme weather conditions. Human eyes are obviously susceptible to the same conditions, but if the option is there, why not do better than human eyes?

    • tias@discuss.tchncs.de · 1 month ago

      Human eyes are way better at this than any camera-based self-driving system. No self-driving system comes anywhere close to handling a Swedish winter with bad weather and no sun, yet we Swedes do it routinely by the millions every day.

    • Echo Dot@feddit.uk · 1 month ago

      Human eyes are much better at picking out shapes in complex visual clutter than computers are. Not to mention that human eyes have vastly better dynamic range than cameras (every time you go into or out of a tunnel, the camera is essentially blind for a couple of seconds until it readjusts). Both of these mean that while you absolutely do need the cameras, you can’t just rely on them and say “oh well, humans only use their eyes”, because the cameras and the computers are nowhere near human-level capable.

  • Darkassassin07@lemmy.ca · 1 month ago

    As much as I like Mark, he’s got some explaining to do.

    At 15:42 the center console is shown, and Autopilot disengages before impact. It was also engaged at 39 mph during the YouTube cut, and he struck the wall at 42 mph (i.e. the car accelerated into the wall).

    Mark then posted the ‘raw footage’ on Twitter. This also shows Autopilot disengaging before impact, but shows it was engaged at 42 mph. This was a separate take.

    /edit:

    YouTube, first frames showing Autopilot enabled: 39 mph

    Twitter, first frames showing Autopilot enabled: 42 mph

    • nickwitha_k (he/him)@lemmy.sdf.org · 1 month ago

      Mark then posted the ‘raw footage’ on Twitter. This also shows Autopilot disengaging before impact, but shows it was engaged at 42 mph. This was a separate take.

      No. That’s by design. “Autopilot” is made to disengage whenever a collision appears imminent, to try to reduce the likelihood of someone finding Tesla liable for the system being unsafe.

      • wewbull@feddit.uk · 1 month ago

        Not saying you’re wrong (because I’ve always found it suspicious how Tesla always seems to report that Autopilot was disengaged during fatal accidents), but there are probably some people asking themselves “how could it detect the wall in order to disengage itself?”.

        The image on the wall has a perspective baked into it, so it only looks right from one position: the distance at which the lines of the real road match up perfectly with the lines of the road painted on the wall. As you get closer than that distance, the illusion starts to break down. The object-tracking software effectively says “There are things moving in ways I can’t predict. Something is wrong here. I give up. Hand control to driver.”

        Autopilot disengaged.

        (And it only noticed a fraction of a second before hitting it, yet Mark is very conscious of it. He’s screaming.)

        Sidenote: the same is true as you move further from the wall than the ideal distance; the illusion breaks down in that direction too. However, the effect is far more subtle when you’re too far away. After all, the wall is just a tiny bit of your view from a long way off, but it’s your whole view when you’re just about to hit it.
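
        To make that intuition concrete, here’s a minimal pinhole-geometry sketch in Python (illustrative numbers, not measurements from the video). A road point gets painted on the wall where the sight line from the intended viewing distance crosses the wall; the angular mismatch between the painted point and the real point it imitates is zero at that distance and blows up as the car gets close:

            import math

            def angular_error_deg(d, d0, x=2.0, z=20.0):
                # d:  camera distance from the wall (m)
                # d0: distance the illusion was painted to look correct from (m)
                # x:  lateral offset of the depicted road point (m)
                # z:  how far beyond the wall the depicted point "sits" (m)
                x_paint = x * d0 / (d0 + z)           # sight line from d0 crosses the wall here
                theta_paint = math.atan2(x_paint, d)  # apparent direction of the painted point
                theta_true = math.atan2(x, d + z)     # direction the real point would have
                return math.degrees(abs(theta_paint - theta_true))

            for d in (100, 60, 40, 20, 10, 5, 2):     # illusion painted for a 40 m viewpoint
                print(f"{d:>3} m out: {angular_error_deg(d, d0=40):5.2f} degrees of error")

        At 100 m the error is a fraction of a degree (the subtle “too far away” case above), it’s exactly zero at the design distance, and inside ~10 m it grows rapidly, which is roughly where tracking would start seeing “impossible” motion.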

        • Darkassassin07@lemmy.ca · 1 month ago

          It would, but he explicitly says ‘without even a slight tap on the brakes’ in the YouTube video.

          Then:

          Here is the raw footage of my Tesla going through the wall. Not sure why it disengages 17 frames before hitting the wall but my feet weren’t touching the brake or gas.

          - Mark Rober, on Twitter
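
          For scale: assuming the footage is 30 fps (the frame rate isn’t stated), 17 frames works out to about 0.57 s, or roughly 10–11 m of travel at 42 mph before impact:

              fps = 30      # assumption; the clip's frame rate isn't stated
              frames = 17   # from Mark Rober's tweet
              mph = 42      # impact speed shown in the footage

              seconds = frames / fps                # ~0.57 s
              meters = mph * 0.44704 * seconds      # 42 mph ~ 18.8 m/s
              print(f"{seconds:.2f} s, {meters:.1f} m before impact")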

          • billwashere@lemmy.world · 1 month ago

            He did state that he hit the brakes. Just on the fog one, not the wall. 🤷

            But the fact that FSD disengages 17 frames before the crash also implies the software, along with the car, crashed 😂 I’d love to see the logs. I imagine the software got real confused real quick.

            • LeninOnAPrayer@lemm.ee · 1 month ago

              This is likely what happened. The software hit an invalid state that defaulted to disengaging the autopilot feature. It likely hit an impossible state because the camera could no longer reconcile the “wall” portion of the road with the actual road as it got too close.

              Likely an error condition occurred that would happen the same way if an object suddenly covered the camera while driving. It would be even more concerning if the autopilot DIDN’T disengage at that point.
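
              As a rough sketch of that failure mode (a generic watchdog pattern, not Tesla’s actual code): the system drops to a fail-safe state, disengaging and alerting the driver, whenever its world model stops being self-consistent.

                  from dataclasses import dataclass

                  @dataclass
                  class PerceptionFrame:
                      tracking_confidence: float  # 0..1: how well tracked objects match predictions
                      depth_consistent: bool      # does estimated depth agree with motion parallax?

                  def should_disengage(frame: PerceptionFrame, min_conf: float = 0.3) -> bool:
                      # Generic fail-safe: bail out when the world model stops making sense.
                      return frame.tracking_confidence < min_conf or not frame.depth_consistent

                  # The painted wall near impact: texture says "road ahead",
                  # parallax says "flat surface right in front of you".
                  frame = PerceptionFrame(tracking_confidence=0.1, depth_consistent=False)
                  if should_disengage(frame):
                      print("DISENGAGE: hand control back to the driver")

              Under a scheme like this, the wall would only trip the check once the illusion breaks down at close range, which would line up with the last-moment disengagement in the footage.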