It seems like a good decision, then, to limit self-driving systems to situations where they are less likely to fail.
FSD is probably already a safer driver than a human.
Even with the horrendous driving skills of some people, that’s a very bold claim without some actual evidence.
When it fails, this generally means that it got stuck somewhere, not that it caused an accident. I haven't seen the video in question, but that was probably an older version or Autopilot, not FSD.
It doesn’t make that much difference what Tesla calls their latest beta software update imho. If their Autopilot is enough to get you into dangerous situations, how is a system with even less human oversight going to be fundamentally different? I’ll need to see some more critical reviews of this system, after years of them not delivering on their claims and only rolling features out to select beta testers to maintain plausible deniability.
I didn’t find the specific video of older versions trying really hard to drive into oncoming traffic, though there are plenty. I did find one of the FSD Beta from six months ago, though, where it can’t seem to decide which lane is correct.
It doesn’t make that much difference what Tesla calls their latest beta software update imho.
Autopilot and FSD Beta are two different systems, of which Autopilot is the less advanced one. There’s only one death ever linked to the use of FSD Beta, and that includes the older versions as well.
The only statistics available regarding the safety of FSD and Autopilot are from Tesla itself, which one should probably take with a grain of salt, but they seem to indicate it being about 5x safer than the average American driver.
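For what it’s worth, a claim like that usually comes from dividing miles driven per crash with the system engaged by the average miles per crash across all drivers. A minimal sketch, with placeholder figures I made up (not Tesla’s actual numbers), shows both the arithmetic and why the grain of salt is warranted:

```python
# Rough sketch of how a "Nx safer" figure is typically derived.
# Both mileage figures below are placeholders, not Tesla's actual numbers.
autopilot_miles_per_crash = 4_500_000   # assumed: miles per crash with the system engaged
us_average_miles_per_crash = 500_000    # assumed: average miles per crash across all US drivers

ratio = autopilot_miles_per_crash / us_average_miles_per_crash
print(f"Claimed safety multiple: {ratio:.1f}x")

# Caveat: this naive ratio ignores confounders. The system is engaged mostly
# on highways (fewer crashes per mile than city streets), and Tesla's fleet
# skews toward newer cars, so the two denominators aren't really comparable.
```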
Then there are of course plenty of independent YouTubers putting these systems to the test, such as AI DRIVR and CYBRLFT, who give pretty honest assessments of their strengths and weaknesses.