Ha only if. Autopilot turns off right before a crash so that Tesla can claim it was off and blame it on the driver. Look it up.
The driver is always to blame, even if it was on. They turn it off for the marketing claims.
PS: fuck elon
I didn’t know this, but I’m not shocked, or even a little bit surprised.
Mark Rober made a video testing the autopilot systems of several cars, including his own Tesla. The car turned off Autopilot right before he crashed through a styrofoam wall.
This is how they claim Autopilot is safer than human drivers. In reality Tesla has one of the highest fatality rates, but magically all of those happen when Autopilot was “off”.
It turns it off with the parking sensor 2ft before the accident.
But Tesla never claims you can stop supervising the car, so if you didn’t notice and stop it, it’s your fault. Fuck Elon and all that, but it is somewhat reasonable.
Holy shit I did indeed look it up, and it’s true. Dunno if it’ll hold up but it’s still shady as shit
Most states assign liability to whoever is in the driver’s seat anyway. If you are operating the vehicle, even if you’re not controlling it at that moment, you are expected to maintain safe operation.
That’s why the Uber self driving car that killed someone was considered the test driver’s fault and left Uber mostly off the hook.
Not sure how it works for the robo taxis, though.
Yeah that’s gonna be tricky with those. I live in Vegas where they’re already operating. No steering wheel at all.
I don’t know the specifics of how the law is implemented, but the self-driving levels are defined such that from SAE Level 3 onward you may have a case against the manufacturer. I haven’t kept up to date with Tesla’s SAE level, but I imagine they’re still officially at Level 2 because it lets them keep their hands clean.
Well… What about blaming the passengers?
Now, I would like to imagine the legal case of an accident involving a self driving robo-taxi transporting another robot to a facility (owned by the company).
Maybe they can blame the humans who suffered the accident?