In yet another Tesla Autopilot crash, Heather Lommatzsch filed suit against Tesla in a Utah state court for negligence, among other claims. She crashed her Tesla Model S into a fire engine in May 2018 while Autopilot was engaged and she was looking at her phone. Lommatzsch was under the erroneous impression that the car would safely stop on its own if another object, like a fire engine, was in its way. She broke her foot in the accident, but also claims she has suffered physical impairment and the loss of "the pleasures and enjoyment of life."
Tesla Autopilot -- Unsafe at Any Speed?
In 2015, Tesla rolled out its Autopilot program, which, when engaged, automatically takes over the car's braking, steering, and lane-changing duties. But multiple Tesla crashes, including two fatal ones, suggest the Autopilot program does not always do its job adequately. Lommatzsch claims that Tesla was negligent in putting a defective product on the market that doesn't do what it claims to do: avoid accidents. She is also suing Tesla for failing to warn owners that it doesn't, in fact, avoid accidents.
To prove a defective product claim, the plaintiff will need to show that Tesla's Autopilot doesn't drive the car safely. And this may very well be provable. Though Tesla has repeatedly claimed that its Autopilot feature reduces crash rates by 40 percent, this may be fake news. According to the National Highway Traffic Safety Administration (NHTSA), there is very little scientific data to back that claim. Many competitors have similar technology, yet they have not rushed to put their products on the market. Plaintiffs may dig deeper into this and find that there is a good reason the competitive landscape for self-driving cars is so desolate.
Autopilot Not Synonymous With Autonomous
Regarding the failure-to-warn claim, Tesla contends it has repeatedly stated that Autopilot does not shift blame for accidents from the driver to Tesla. In fact, Elon Musk has been quoted as saying, "The onus is on the pilot to make sure the autopilot is doing the right thing ... We're not yet at the stage where you can go to sleep and wake up at your destination. We would have called it autonomous ... if that were the case."
But saying it doesn't necessarily make it so. Drivers quickly get into the habit of paying very little attention to the road, lulled into being passive passengers rather than active drivers. And according to other companies in the driverless car space, like Google and Zoox, it's possible that Tesla should have seen this coming.
If you or someone you love has been involved in a Tesla crash with Autopilot engaged, contact a local personal injury attorney, who can review the facts of your case, provide sound legal guidance, and recommend your best next steps.
Related Resources:
- Find a Local Personal Injury Attorney (FindLaw's Lawyer Directory)
- Tesla Model S Autopilots Itself Into a Parked Police Car (FindLaw Legal Grounds)
- Tesla Pushes Back Requests for Crash Data (FindLaw Technologist)