I have no confidence that Tesla will fix this before the planned Robo-Taxi rollout in Austin in 2 weeks.

After all, they haven’t fixed it in the last 9 years that self-driving Teslas have been on the road.

  • Otter@lemmy.ca · edited · 4 days ago

    I believe Waymo has a better set of sensors (lidar + radar + cameras instead of just cameras), more processing power, and more research, time, and resources spent on it compared to Tesla.

    So it’s not that we aren’t ready for self-driving taxis, but rather a question of which cars are ready to provide that service.

    • ShittyBeatlesFCPres@lemmy.world · 4 days ago

      I think Waymo is also trying to prioritize safety. I was in San Francisco recently and took one, just out of curiosity, from my hotel to a Giants game. It seemed to stop when pedestrian traffic got heavy instead of going all the way to the stadium. So, like three blocks from the stadium. No biggie. I might have told a human taxi driver I could walk from there.

      I’m not sure if it’s a California regulation or Waymo trying to play it safe but I will never get in a self-driving car regulated by Texas and designed to the specifications of one of history’s biggest dumbasses.

      • fluxion@lemmy.world · 4 days ago

        Working with cities to regulate self-driving and plan out specific routes/infrastructure was always going to be the only path to widespread adoption, but Elon was too busy grifting off bullshit claims like everyone’s Teslas moonlighting as self-driving taxis and paying for themselves.

      • BrianTheeBiscuiteer@lemmy.world · 3 days ago

        That’s part of the reason Teslas are not well-suited for this. One camera in each direction, with no other sensors to help make decisions, is a really bad way to ensure safety.

        Humans normally have two “front-facing cameras” (i.e. two eyes), so we have depth perception. We also process light differently than cameras do, so infrared light (for one) doesn’t affect our decisions. We also have ears, so the sound of a loud motorcycle engine tips us off even when we only see a speck in the distance. And we use context clues to inform our decisions: if other drivers change lanes quickly, we’re extra observant for road obstacles.

        That’s not to say technology can never be as good as a human at driving, but we use a lot more than a single “moving picture” to decide what we should do.
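
        The depth-perception point can be sketched with the classic pinhole stereo relation: two horizontally offset views of the same object let you recover its distance from how far it shifts between the images. This is a generic illustration, not any particular car’s vision stack; the focal length, baseline, and disparity numbers are made up:

```python
# Sketch of stereo depth recovery: Z = f * B / d, where f is the focal
# length in pixels, B the distance between the two "eyes" (baseline),
# and d the horizontal shift (disparity) of the object between views.
# All numbers below are illustrative, not from any real system.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object effectively at infinity")
    return focal_px * baseline_m / disparity_px

# A nearby object shifts a lot between the two views (large disparity),
# a distant one barely shifts (small disparity).
near = depth_from_disparity(focal_px=1000.0, baseline_m=0.06, disparity_px=30.0)  # ~2 m
far = depth_from_disparity(focal_px=1000.0, baseline_m=0.06, disparity_px=3.0)    # ~20 m
```

        With a single camera there is no disparity to measure at all, which is why one camera per direction gives up this depth cue entirely.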

        • IphtashuFitz@lemmy.world · edited · 3 days ago

          To be fair, the Tesla vision system has 3 cameras facing forward: one in the center above the front bumper grille and two behind the rear-view mirror. Those two provide some level of stereoscopic vision to help judge distances.

          But yeah, the lack of other sensors is a huge issue. Anything from bug splatter to mud to snow etc. can easily obscure one or more cameras and render the whole vision system unreliable.

          > We also process light differently than cameras do

          To expand on this a little further, human vision has also developed the ability to filter out unnecessary information in order to avoid overloading the brain. When tracking moving objects the eyes mostly send deltas of the movement to the brain. Computers, however, are the exact opposite. The cameras essentially send a series of still images, and it’s up to the computer to compare them to look for any movement.
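
          To illustrate that last point: with a camera feed, motion detection typically boils down to differencing successive still images and flagging pixels that changed. A toy pure-Python sketch (the frame data and threshold are made up for illustration):

```python
# Sketch of naive frame differencing: the camera reports full still
# images, so the computer must compare consecutive frames itself to
# find motion -- the opposite of eyes sending movement deltas.

def motion_mask(prev, curr, threshold=20):
    """Mark pixels whose brightness changed by more than `threshold`."""
    return [[abs(c - p) > threshold for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

frame1 = [[10, 10, 10],
          [10, 10, 10]]
frame2 = [[10, 200, 10],   # one bright pixel appeared between frames
          [10, 10, 10]]

mask = motion_mask(frame1, frame2)
moved = sum(cell for row in mask for cell in row)  # count of changed pixels
```

          Real systems add noise filtering and object tracking on top, but the core burden is the same: every frame must be re-processed in full, with no built-in notion of “what changed.”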