Sunday, April 23, 2017

A few Tesla owners filed a class-action lawsuit over the rollout of Tesla Autopilot 2.0 [Updated]


Last month, we reported on Hagens Berman, one of the law firms leading a class action lawsuit against VW and Mercedes for the emissions-cheating software, attempting to start a class action against Tesla over the claims made for Autopilot 2.0 features: Enhanced Autopilot and Full Self-Driving capabilities.
The firm has now officially filed the class action, led by three Tesla owners.

In the actual class action, they seem to have focused on the ‘Enhanced Autopilot’ feature instead of the self-driving feature, a distinction they didn’t seem to understand when they were seeking participants for the suit, as we pointed out at the time.
As for ‘Enhanced Autopilot’, they are calling it “essentially unusable and demonstrably dangerous”. Steve Berman, managing partner of Hagens Berman, which represents the plaintiffs, said:
“Tesla has endangered the lives of tens of thousands of Tesla owners across the country, and induced them to pay many thousands of dollars for a product that Tesla has not effectively designed. Tesla sold these vehicles as the safest sedan on the road. What consumers received were cars without standard safety enhancements featured by cars costing less than half the price of a new Tesla, and a purported ‘Enhanced Autopilot’ that operates in an erratic and dangerous manner.”
He continued by saying that “to this day, Tesla has not released truly functional software for its Standard Safety Features or Enhanced Autopilot.”
While the suit correctly states that Tesla missed a few deadlines in bringing AP2 cars to parity with the first-generation vehicles, the notes for the Enhanced Autopilot option clearly state that the feature is “dependent on extensive software validation and regulatory approval”.
Furthermore, the lawsuit incorrectly describes the current features of the Autopilot on vehicles with second generation hardware as only having “a dangerously defective Traffic Aware Cruise Control” and “the remaining features simply do not exist.”
Of course, that’s inaccurate. Tesla’s ‘Summon’ feature has been released on Autopilot 2.0, as have several updated versions of Autosteer and now even Auto Lane Change. They should know that, since I mention it in an article that they are using as a reference in their own lawsuit.
The suit is not only seeking to have Tesla buy back the vehicles, but also damages for “the conduct of Tesla related to the defective Standard Safety Features and Enhanced Autopilot” and what they describe as “Tesla’s knowing fraud that garnered it illicit profits for a product suite that does not exist and put drivers at risk.”
We asked Tesla for a comment on the class-action, but we didn’t get an answer.
Update: a Tesla spokesperson sent us the following statement:
This lawsuit is a disingenuous attempt to secure attorney’s fees posing as a legitimate legal action, which is evidenced by the fact that the suit misrepresents many facts. Many of the features this suit claims are “unavailable” are in fact available, with more updates coming every month. We have always been transparent about the fact that Enhanced Autopilot software is a product that would roll out incrementally over time, and that features would continue to be introduced as validation is completed, subject to regulatory approval. Furthermore, we have never claimed our vehicles already have functional “full self-driving capability”, as our website has stated in plain English for all potential customers that “it is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval.”  The inaccurate and sensationalistic view of our technology put forth by this group is exactly the kind of misinformation that threatens to harm consumer safety.
Here’s the suit in full if you want to read it:

Michael DeKort • April 20, 2017

It's unfortunate that Elon Musk's ego, which drove him to do some amazing things, has now driven him to act in a dangerous, unethical way.
Regarding regulations: when NASA first reviewed his SpaceX code, they rejected it for not being properly tested, especially negative testing, and for not doing nearly enough exception handling. That happened because NASA, unlike Commercial IT, actually uses best engineering practices. They were the regulators. In this case, NHTSA has punted. They are in over their heads just as much as the Commercial IT folks who are making AP. These folks drank way too much of their own bathwater. They are way too impressed with their skills at making apps, games and websites. Just look at the folks in charge across many of these AP companies. They came from PayPal, Twitter, Swift, travel websites, etc. It's a massive echo chamber.
In addition, RAND, MIT and Mobileye have all come out recently and said AI is valuable but way overestimated. The folks who use it do not really know how it works. Corner cases are not found, and it would take hundreds of billions of miles of driving to stumble on all the data they need. These engineers are using machine learning to be experts for them. Since they have almost no background in this domain or in actual best practices, they have no choice. What really should be happening is AI mixed with proper systems engineering, other existing data sources and simulation to create a Scenario Matrix: a tool that would ensure due diligence is done, everyone is on the same page, and scenarios are handled the same way. What happens if a Ford AP owner buys a Tesla? Are all the same scenarios handled? Are they handled the same way? Does a difference entice the driver to do or not do something they shouldn't because of an expectation from the previous car?
Just in case the echo chamber of folks who only know what they know from the press, and have no experience in any of this, chime in that I am against AP: no, I am for it. So much so that I want to see it happen ASAP. That ONLY happens if it is done in the right way, and Tesla is not doing it the right way: putting people at risk needlessly, depending way too much on AI and not using actual best systems engineering practices.
Below, I provide more detail and cite links that support the comments I made above.
Lockheed Engineer/Whistleblower - Due Diligence Recommendations for Autonomous and Driverless Industry