• snooggums@lemmy.world · +180/-1 · 2 months ago

    Paraphrasing:

    “We only have the driver’s word they were in self driving mode…”

    “This isn’t the first time a Tesla has driven onto train tracks…”

    Since it isn’t the first time, I’m gonna go ahead and believe the driver, thanks.

    • Pika@sh.itjust.works · +106/-1 · 2 months ago

      Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-driving mode when it went onto the tracks. So the fact that they haven’t gone public saying it wasn’t means it was in self-driving mode, and they want to save face and limit liability.

      • IphtashuFitz@lemmy.world · +68 · 2 months ago

        I have a nephew who worked at Tesla as a software engineer for a couple of years (he left about a year ago). I gave him the VIN of my Tesla, and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information, then clearly they are logging a LOT of data.

          • Pika@sh.itjust.works · +18 · 2 months ago

            Dude, in today’s world we’re lucky if it stops at the manufacturer. I know of a few insurers that have contracts with major dealers and automatically get the data registered by the car’s systems, so they can make better decisions about people’s car insurance.

            Nowadays it’s a red flag if you sign up for car insurance and they don’t offer you a discount for turning on something like drive pass, which logs your driving, because it probably means your car is already sending them that data.

            • CmdrShepard49@sh.itjust.works · +3/-1 · 1 month ago

              We just got back from a road trip in a friend’s ’25 Tundra. It popped up a TPMS warning for a faulty sensor, and minutes later he got a text from the dealership telling him about it and asking him to bring it in for service.

      • catloaf@lemm.ee · +36/-1 · 2 months ago

        I’ve heard they also like to disengage self-driving mode right before a collision.

        • sylver_dragon@lemmy.world · +5/-15 · 2 months ago

          That actually sounds like a reasonable response. Driver assist means that a human is supposed to be attentive and ready to take control. If the system detects a situation where it’s unable to make a good decision, dumping that decision on the human in control seems like the closest thing they have to a “fail safe” option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a “human, take the wheel” followed by a “slam the brakes” if no input is detected within 2-3 seconds. While an emergency stop isn’t always the right choice, it probably beats leaving a several-ton metal object hurtling along uncontrolled in nearly every circumstance.
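
          To make that fallback concrete, here is a minimal sketch of the “hand over, then stop” logic being described. It is purely illustrative: the car object, its method names, and the 2.5-second grace period are assumptions, not any real driver-assist API.

          ```python
          # Hypothetical sketch of the handover-then-stop fallback described above.
          # The car object, its methods, and the grace period are made up for
          # illustration; this is not any real driver-assist API.
          import time

          TAKEOVER_GRACE_S = 2.5  # assumed "human, take the wheel" window

          def handle_low_confidence(car) -> None:
              """Ask the human to take over; emergency-stop if they never respond."""
              car.alert_driver("TAKE OVER NOW")            # loud visual/audible warning
              deadline = time.monotonic() + TAKEOVER_GRACE_S
              while time.monotonic() < deadline:
                  if car.driver_input_detected():          # steering torque, pedals, etc.
                      car.disengage_assist()               # the human is now in control
                      return
                  time.sleep(0.05)                         # poll at ~20 Hz
              car.emergency_stop()                         # no response: brake to a stop
          ```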

          • zaphod@sopuli.xyz · +40/-2 · 2 months ago (edited)

            That actually sounds like a reasonable response.

            If you give the driver enough time to act, sure. But Tesla doesn’t: they turn it off a second before impact and then claim it wasn’t in self-driving mode.

            • whotookkarl@lemmy.world · +11/-1 · 2 months ago

              Not even a second; it’s sometimes less than 250-300 ms. If I hadn’t already been anticipating it failing and disengaging as it went through the two-lane-wide turn, I would have gone straight into oncoming traffic.

          • elucubra@sopuli.xyz · +2/-1 · 2 months ago (edited)

            I don’t know if that is still the case, but a lot of electronics in the US used to carry warnings, with pictures, like “don’t put it in the bath” and the like.

            People are dumb, and you should take that into account.

        • GreenBottles@lemmy.world · +2/-14 · 2 months ago (edited)

          That sounds a lot more like a rumor to me… it would be extremely suspicious and would leave them open to GIGANTIC liability issues.

          • catloaf@lemm.ee · +28 · 2 months ago

            In the report, the NHTSA spotlights 16 separate crashes, each involving a Tesla vehicle plowing into stopped first responders and highway maintenance vehicles. In the crashes, it claims, records show that the self-driving feature had “aborted vehicle control less than one second prior to the first impact”

            https://futurism.com/tesla-nhtsa-autopilot-report

          • ayyy@sh.itjust.works · +3/-3 · 2 months ago (edited)

            How so? The human in the car is always ultimately responsible when using level 3 driver assists. Tesla does not have level 4/5 self-driving and therefore doesn’t have to assume any liability.

            • Pika@sh.itjust.works · +1 · 2 months ago (edited)

              This right here is another fault in regulation that will eventually catch up with us. Especially with level three, where it’s primarily the vehicle driving and the driver just gives periodic input, it’s not the driver that’s in control most of the time; it’s the vehicle, so it should not be the driver at fault.

              Honestly, I think everything up to level two should be the driver’s fault, because those levels require constant driver input. However, level three (conditional driving) and higher should be considered the company’s liability, unless the company can prove that the autonomous system handed control back to the driver in a way a human could reasonably act on (i.e., not within the last second, like Tesla currently does).
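
              As a toy illustration of that split (not how any current law or any manufacturer’s system actually works), the proposed rule could be encoded roughly like this; the level cutoffs and the 2-second handover threshold are assumptions:

              ```python
              # Toy encoding of the liability rule proposed above; the threshold
              # and the function are illustrative only, not any real regulation.
              from typing import Optional

              MIN_HANDOVER_SECONDS = 2.0

              def liable_party(automation_level: int, handover_seconds: Optional[float]) -> str:
                  """Return who this proposal would hold at fault after a crash."""
                  if automation_level <= 2:
                      return "driver"                   # constant supervision is required
                  if handover_seconds is not None and handover_seconds >= MIN_HANDOVER_SECONDS:
                      return "driver"                   # control was returned in time
                  return "manufacturer"                 # e.g. disengaging <1 s before impact

              print(liable_party(2, None))   # driver
              print(liable_party(3, 0.8))    # manufacturer
              ```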

    • Mouselemming@sh.itjust.works · +21/-1 · 2 months ago (edited)

      Since the story has 3 separate incidents where “the driver let their Tesla turn left onto some railroad tracks,” I’m going to posit:

      Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.

      Prove me wrong, Tesla

      • Tarquinn2049@lemmy.world · +2 · 2 months ago

        Map data obtained and converted from other formats often ends up accidentally condensing labeling categories. One result is train tracks being categorized as generic roads instead of retaining their specific sub-heading. Another, unrelated to this but common for people who play geo games, is forests and water areas ending up tagged as the wrong specific types.
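
        As a rough sketch of that failure mode, here is how a lossy category table during map-format conversion could turn a railway into a plain road. The tag names are OSM-style examples I made up, not any vendor’s actual map schema.

        ```python
        # Illustrative only: a coarse category table used during map conversion.
        COARSE_CATEGORIES = {
            "motorway": "road",
            "residential": "road",
            "railway": "rail",       # if this entry is dropped during conversion...
        }

        def convert_feature(kind: str) -> str:
            """Condense a detailed feature type into a coarse routing category."""
            return COARSE_CATEGORIES.get(kind, "road")   # ...unknown types default to "road"

        print(convert_feature("railway"))    # "rail" with the full table
        del COARSE_CATEGORIES["railway"]     # simulate the lossy conversion
        print(convert_feature("railway"))    # "road": the tracks now look drivable
        ```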

        • Mouselemming@sh.itjust.works · +1 · 2 months ago

          Aha. But that sounds correctable… So not having any people assigned to checking railroads and making sure the system recognizes them as railroads would be down to miserliness on the part of Tesla, then… And it might also say something about why some Teslas have been known to drive into bodies of water (or children, but that’s probably a different instance of miserliness).

      • AA5B@lemmy.world · +2/-1 · 2 months ago

        I mean… Tesla self-driving allegedly did this three times in three years, but we don’t yet have public data to verify that’s what happened, nor any comparison to what human drivers do.

        Although one of the many ways I think I’m an above-average driver (just like everyone else) is that people do a lot of stupid things at railroad crossings, and I never would.

        • Mouselemming@sh.itjust.works · +4 · 2 months ago

          I’m pretty sure Tesla self-drive does a lot of stupid things you never would, too. That’s why they want you at the wheel, paying attention and ready to correct it in an instant! (Which defeats the whole benefit of self-drive mode imho, but whatever)

          The fact that they can avoid all responsibility and blame you for their errors is, of course, the other reason.

    • XeroxCool@lemmy.world · +18/-9 · 2 months ago

      The ~2010 runaway-Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, outright lying to evade a speeding ticket, etc. were the cause in many cases.

      Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don’t like a company? I don’t think so. But if Tesla has proof FSD was off, we’ll know in a minute, when they invade the driver’s privacy and release the driving events.

      • snooggums@lemmy.world · +53/-1 · 2 months ago

        Tesla has constantly lied about their FSD for a decade. We don’t trust them because they are untrustworthy, not because we don’t like them.

        • AA5B@lemmy.world · +2/-1 · 2 months ago

          They promote it in ways that lead people to trust it too much… But when it comes to releasing telemetry in particular, I don’t remember that ever being an accusation.

          • ayyy@sh.itjust.works · +4 · 2 months ago

            It’s more about when they don’t release it, or only selectively say things that make them look good and stay silent when they look bad.

      • aramis87@fedia.io · +8/-1 · 2 months ago

        How is a manufacturer going to be held responsible for their flaws when musk DOGE’d every single agency investigating his companies?

      • Lka1988@lemmy.dbzer0.com · +5/-1 · 2 months ago

        The ~2010 runaway-Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, outright lying to evade a speeding ticket, etc. were the cause in many cases.

        I owned an FJ80 Land Cruiser when that happened. I printed up a couple stickers for myself, and for a buddy who owned a Tacoma, that said “I’m not speeding, my pedal’s stuck!” (yes I’m aware the FJ80 was slow as dogshit, that didn’t stop me from speeding).

    • TheKingBee@lemmy.world · +2 · 2 months ago (edited)

      Maybe I’m missing something, but isn’t it trivial to take it out of their bullshit dangerous “FSD” mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?

      • snooggums@lemmy.world · +3 · 2 months ago

        On some railroad crossings you might only need to roll off the crossing to get stuck on the tracks, unable to back out. Trying to get out accounts for another 30-40 feet.

        Being caught off guard when the car isn’t supposed to do that is how you get stuck in the first place. Yeah, a terrible driver trusting shit technology.