Litigating Autopilot products-liability cases against Tesla
Legal issues surrounding Tesla’s Autopilot systems are multifaceted, involving consumer protection, safety and liability, regulatory compliance, and driver responsibility.
Tesla’s Autopilot is a driver-assistance system meant to reduce the driver’s overall workload. Tesla claims that Autopilot “enhances safety and convenience behind the wheel” and makes “driving safer and less stressful.” (https://www.tesla.com/support/autopilot) First introduced in 2015, Autopilot is classified as a Level 2 driving-automation system under SAE International’s J3016 standard, which defines six levels of driving automation, from Level 0 to Level 5. As the levels climb, the automated capabilities advance, until at the highest level the car can drive itself everywhere, under all conditions. At Level 2, the human in the driver’s seat remains the actual driver while the driver-support features are engaged and is responsible for constantly supervising those features to maintain safety.
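For reference, here is a minimal sketch (ours, not SAE’s or Tesla’s) of the J3016 taxonomy expressed in code; the level descriptions are paraphrased, and the supervision rule reflects the standard’s dividing line between Levels 0-2 and Levels 3-5:

```python
# Paraphrased summary of the SAE J3016 levels (illustrative, not official text).
SAE_LEVELS = {
    0: "No driving automation; the human performs the entire driving task",
    1: "Driver assistance; steering OR speed support, human still drives",
    2: "Partial automation; steering AND speed support, human must supervise constantly",
    3: "Conditional automation; system drives within limits, human takes over on request",
    4: "High automation; system drives within its design domain, no human fallback needed",
    5: "Full automation; system drives everywhere, under all conditions",
}

def driver_must_supervise(level: int) -> bool:
    """At Levels 0-2 the human in the seat is the driver and must supervise."""
    return level <= 2

# Both Autopilot and Full Self-Driving are classified as Level 2.
assert driver_must_supervise(2)
```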
Tesla’s Autopilot includes features such as Traffic-Aware Cruise Control (matching the Tesla’s speed to that of the surrounding traffic) and Autosteer (assisting with steering within a clearly marked lane). Autopilot now comes standard on every new Tesla and can be purchased by Tesla owners who took delivery before the feature was standard.
Tesla’s controversially named Full Self-Driving system is distinct from Tesla’s Autopilot system. Tesla claims that Full Self-Driving allows the vehicle to “be able to drive itself almost anywhere with minimal driver intervention.” (https://www.tesla.com/support/autopilot) According to Tesla, Full Self-Driving includes all of the Autopilot features plus the following:
Navigate on Autopilot: Actively guides your vehicle from a highway’s on-ramp to off-ramp, including suggesting lane changes, navigating interchanges, automatically engaging the turn signal and taking the correct exit.
Auto Lane Change: Assists in moving to an adjacent lane on the highway when Autosteer is engaged.
Autopark: Helps automatically parallel- or perpendicular-park your vehicle, with a single touch.
Summon: Moves your vehicle in and out of a tight space using the mobile app or key.
Smart Summon: Your vehicle will navigate more complex environments and parking spaces, maneuvering around objects as necessary to come find you in a parking lot.
Autosteer on city streets
Traffic and Stop Sign Control: Identifies stop signs and traffic lights and automatically slows your vehicle to a stop on approach.
Despite bearing the name “Full Self-Driving” (“FSD”), Tesla’s FSD is still classified as a Level 2 system under the SAE taxonomy. Even when FSD is engaged, Tesla says the driver must maintain constant supervision (eyes on the road) and be ready to take over the controls when necessary.
When Tesla’s Autopilot was first introduced in 2015, it was a breakout technology. Since then, following numerous injuries and deaths suspected to have been caused by Autopilot or Full Self-Driving, the automaker has faced mounting criticism and lawsuits seeking to hold it accountable.
Arbitration clause
Unlike other automakers, Tesla does not sell through a dealership network. Instead, it sells directly to consumers through online ordering and purchase agreements. Since 2017, Tesla has incorporated into its “Motor Vehicle Purchase Agreement” contract of sale a provision called “Agreement to Arbitrate.” It purports to cover “any dispute arising out of or relating to any aspect of the relationship between you and Tesla and its affiliates,” requiring such disputes to be resolved in arbitration rather than in the courts. While this provision may cover “claims related to statements about our [Tesla’s] product,” it likely does not preclude personal-injury claims in court, especially where a third party who is not a party to the purchase contract is involved, or in a state where arbitration is not enforceable in a personal-injury or wrongful-death case. Tesla purchasers can opt out of arbitration, and preserve their right to a trial, by sending a letter to the company within a month of buying the car. Unfortunately, most consumers do not know to exercise that option. To date, Tesla has focused its forced-arbitration efforts on mass-tort cases, employment cases, and class actions rather than individual-incident products-liability cases. (https://www.nytimes.com/2022/12/19/business/tesla/tesla-class-action-lawsuit-arbitration)
Venue
Before December 2, 2021, Tesla, Inc. was a Delaware corporation with its principal place of business in Santa Clara County, California. Since that date, Tesla’s principal place of business has been in Austin, Texas, where the civil-liability laws are more favorable to manufacturers than in California. If you file an action against Tesla in state court, absent a local-party defendant, Tesla will likely remove the case to federal court. If you do have a local defendant, be aware of the problem of “snap removal.” (See Tracer Research Corp. v. Nat’l Envtl. Servs. Co. (9th Cir. 1994) 42 F.3d 1292.)
Theories of liability
The strict-liability theories against Tesla for injuries and deaths caused by the use of Autopilot or FSD are rooted in the defective design of these systems and the failure to warn of their limitations. The claims include that Tesla programmed Autopilot to allow its use on roadways that Tesla knew or should have known were not suitable. Despite this knowledge, Tesla advertises Autopilot in a way that greatly exaggerates its capabilities and hides its deficiencies. Additionally, a partially automated system like Tesla’s Autopilot is fraught with problems, such as a driver’s inability to safely regain control of the vehicle following an automation failure. (Eriksson, A., & Stanton, N.A., Takeover Time in Highly Automated Vehicles: Noncritical Transitions to and From Manual Control, Human Factors, June 2017, at 689-90.)
If the driver is led to believe that the automated system, which Tesla itself calls “Autopilot” and “Full Self-Driving,” provides greater control than traditional automated driver-assist systems, the driver can be lulled into behaving as though the vehicle is actually an SAE Level 3 or Level 4 autonomous system when in reality it is still only a Level 2. (T.W. Victor, E. Tivesten, P. Gustavsson, J. Johansson, F. Sangberg, M. Ljung Aust, Automation expectation mismatch: Incorrect prediction despite eyes on threat and hands on wheel, 60 Human Factors (8) (2018), at 1095-1116, 10.1177/0018720818788164 [“a key component of driver engagement is cognitive (understanding the need for action), rather than purely visual (looking at the threat), or having hands on wheel”]; R. Lin, L. Ma, W. Zhang, An interview study exploring Tesla drivers’ behavioural adaptation, 72 Applied Ergonomics (2018), 37-47, 10.1016/j.apergo.2018.04.006.)
Tesla knowingly and falsely fosters the understanding that Tesla vehicles are capable of identifying and safely negotiating hazardous situations that the cars simply cannot handle. (Is a self-driving car smarter than a seven-month-old?, The Economist, Science & Technology (Sept. 2, 2021 ed.), https://www.economist.com/science-and-technology/is-it-smarter-than-a-seven-month-old/21804141 [“Autonomous vehicles are getting better, but they still don’t understand the world in the way that a human being does. For a self-driving car, a bicycle that is momentarily hidden by a passing van is a bicycle that has ceased to exist”]; see also E.R. Teoh, What’s in a name? Drivers’ perceptions of the use of five SAE Level 2 driving automation systems, 72 J. Safety Res. 134-51 (2020), 10.1016/j.jsr.2019.11.005.)
National Highway Traffic Safety Administration investigations
On August 13, 2021, the National Highway Traffic Safety Administration (“NHTSA”) opened a formal investigation into Tesla’s Autopilot feature after identifying 11 crashes since 2018 in which Tesla vehicles, with Autopilot engaged, crashed at first-responder sites. (NHTSA ODI Resume for Investigation number PE 21-020.) As part of its investigation of first-responder crashes, NHTSA requested that Tesla respond to various information requests, and Tesla has sought to conceal its responses from the public. (USDOT Memorandum dated October 22, 2021, PE-21-020 Public File.) Despite the NHTSA investigation and mounting questions about the safety of Tesla’s autonomous-driving software and marketing, Tesla announced in September 2021 that it would be expanding the availability of its Full Self-Driving feature.
On December 12, 2023, Tesla filed with NHTSA its Part 573 Safety Recall Report number 23V-838, recalling virtually all Tesla vehicles made since 2012, potentially 2,031,220 vehicles. The report described the Autopilot defect in part: “In certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature.” (NHTSA Part 573 Safety Recall Report number 23V-838.)
Despite the recall, Tesla continues to allow drivers to use its Autopilot system in places where it is not designed to operate safely, with insufficient protection against foreseeable driver misuse. The recall involved only an over-the-air software update that purported to provide additional warnings when the Autopilot system senses driver inattention. (Faiz Siddiqui and Trisha Thadani, Recalling almost every Tesla in America won’t fix safety issues, experts say, The Washington Post, December 16, 2023, https://www.washingtonpost.com/technology/2023/12/16/tesla-autopilot-recall/)
On April 26, 2024, NHTSA announced that it is investigating whether the December 2023 recall of Tesla’s Autopilot system did enough to ensure that drivers pay attention to the road. NHTSA indicated that Tesla has reported 20 more crashes involving Autopilot since the recall; those crashes and the agency’s own tests raise concerns about the recall’s effectiveness.
Discovery
Tesla is, at its heart, a computer-software company. As you may suspect, almost all discovery information will be in the form of electronically stored information (ESI). You will therefore need an ESI protocol and order entered to help get Tesla to cough up the discoverable information in your case, because Tesla is evasive about what ESI it will produce. Be aware of the types of records that have been produced in other litigation so you can best draft your discovery requests.
The EDR or “black box”
Two key ESI items to obtain early in the litigation, even before suit is filed if possible, are the Event Data Recorder (“EDR”) files and the “log files.” (See 49 C.F.R. 563 for regulations about the contents of the event data recorder.) The EDR, a/k/a “black box,” records data when the vehicle experiences a predetermined acceleration or deceleration condition. The trigger can be as simple as hitting a pothole or a curb (likely a “non-deployment event”) or as serious as a crash that fires the airbags (a “deployment event”).
The EDR file can be downloaded from the vehicle by any reputable reconstruction expert using a laptop, Tesla-supplied software (free), and a cable, purchased from a third-party vendor, specific to the vehicle model. The file is extracted from the airbag control module or the vehicle network and then uploaded to Tesla’s servers, where it is interpreted and returned in the form of a report. But note that the data recorded in the EDR is only as good as the data sent to it via a set of unsecured network systems that collect information from throughout the vehicle and deliver it to the airbag control module, where it is sampled. Errors have been demonstrated, such as misreporting the use or non-use of a seat belt, so be sure the data reflects the physical reality observed.
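That sanity check can be done systematically. Here is a hypothetical sketch, assuming the EDR report has been exported to a two-column CSV; the field names and file layout are illustrative only, not Tesla’s actual output:

```python
import csv

# Values your reconstruction expert documented during the physical inspection
# (hypothetical field names for illustration).
observed = {"seat_belt_driver": "BUCKLED", "airbag_deployed": "YES"}

def flag_edr_discrepancies(report_csv: str, observed: dict) -> list[str]:
    """Return EDR fields that contradict the documented physical evidence."""
    discrepancies = []
    with open(report_csv, newline="") as f:
        for row in csv.DictReader(f):            # assumes columns "field","value"
            name, value = row["field"], row["value"]
            if name in observed and value != observed[name]:
                discrepancies.append(
                    f"{name}: EDR says {value!r}, inspection found {observed[name]!r}")
    return discrepancies

for issue in flag_edr_discrepancies("edr_report.csv", observed):
    print(issue)   # each discrepancy is a topic for expert follow-up
```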
Log files and Tesla’s cloud servers
As mentioned, “log files” are collected and stored in the vehicle. These files are far more granular and more helpful in determining what the vehicle was doing at the time of a wreck or other event. The data is recorded continuously throughout the life of the vehicle and uploaded to Tesla’s cloud-based servers, with no need for an accident or other triggering event. Tesla has stated that there are over 2,000 different data elements that are sampled and recorded, most with a timestamp down to the millisecond, although some (e.g., brake-pedal application) are sampled only once per second.
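Because the elements are sampled at different rates, your expert will need to align them on a common timeline before drawing conclusions. A minimal sketch in Python, assuming hypothetical column names and CSV exports (the actual production format varies from case to case):

```python
import pandas as pd

# Assumed exports: speed.csv has timestamp_ms, speed_mph sampled every few ms;
# brake.csv has timestamp_ms, brake_applied sampled only once per second.
speed = pd.read_csv("speed.csv")
brake = pd.read_csv("brake.csv")

for df in (speed, brake):
    df["t"] = pd.to_datetime(df["timestamp_ms"], unit="ms")
    df.sort_values("t", inplace=True)   # merge_asof requires sorted keys

# For each speed sample, attach the most recent brake reading at or before it;
# the tolerance avoids carrying a stale value across gaps longer than 1 second.
aligned = pd.merge_asof(speed, brake, on="t", direction="backward",
                        tolerance=pd.Timedelta("1s"))
print(aligned.tail())   # e.g., the final seconds before impact, speed with brake state
```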
This can be a massive amount of data. Tesla periodically uploads the data from each vehicle to its servers and loads it into a system where it can be analyzed using a proprietary application. The data also remains for a time on an SD card located in the Media Control Unit (MCU), the large display screen in the middle of the vehicle’s dashboard.
If an accident occurs and an airbag deploys, the high-voltage battery safety link is opened and the log files are no longer uploaded to Tesla. To acquire the data pertinent to the accident, you will likely have to have the SD card in the MCU removed, mirror-imaged, and sent to Tesla for interpretation. But, frustratingly, Tesla has refused in every case to produce a complete, interpreted set of log-file data. Instead, it picks and chooses which of the 2,000 data elements it will produce in a native spreadsheet format. Often this is fewer than 100 different types of data, and Tesla will not provide definitions of the data elements, only their computer-software variable names or CAN-network identifiers. Which elements it provides varies from case to case.
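Given that variability, it is worth inventorying exactly which columns appeared in each production; the gaps make useful ammunition for a meet-and-confer letter or a motion to compel. A sketch, with placeholder file names standing in for the native spreadsheets produced in each case:

```python
import pandas as pd

# Placeholder paths to the spreadsheets Tesla produced in different cases.
productions = {"our_case": "log_ourcase.xlsx", "other_case": "log_othercase.xlsx"}

# Read only the header row of each spreadsheet to collect its column names.
columns_by_case = {case: set(pd.read_excel(path, nrows=0).columns)
                   for case, path in productions.items()}

# Report every element that was produced somewhere but withheld elsewhere.
all_columns = set.union(*columns_by_case.values())
for col in sorted(all_columns):
    present = [c for c, cols in columns_by_case.items() if col in cols]
    missing = [c for c in productions if c not in present]
    if missing:
        print(f"{col}: produced in {present}, withheld in {missing}")
```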
Vehicle owners can use the Tesla app on their phones to request log files, which may be helpful if the data did get uploaded to Tesla’s servers. A Tesla computer program (“macro”) will then collect certain data elements, put them into a spreadsheet, and email it to the owner. Through this route, over 200 different data elements are produced, and the data types are interpreted.
Tesla claims it has no catalog of all the elements, let alone a simple data dictionary, which is not in keeping with the standard of practice in software engineering.
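Until Tesla produces one, you can build a starter data dictionary from whatever spreadsheet it does produce and hand it to your expert to annotate. A sketch, assuming a single native-format spreadsheet (the file name is a placeholder):

```python
import pandas as pd

def starter_dictionary(path: str) -> pd.DataFrame:
    """Summarize each produced element: name, inferred type, fill rate, samples."""
    df = pd.read_excel(path)
    rows = []
    for col in df.columns:
        s = df[col]
        rows.append({
            "element": col,                        # Tesla's variable/CAN name
            "inferred_type": str(s.dtype),
            "fill_rate": round(s.notna().mean(), 3),
            "examples": s.dropna().unique()[:3].tolist(),
        })
    return pd.DataFrame(rows)

print(starter_dictionary("log_production.xlsx"))
```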
Get the SD card
One more thing to obtain from the SD card in the MCU, or from Tesla, is the set of video clips recorded at or shortly before an event. Tesla vehicles have from one to eight cameras, depending on the hardware version installed, with varying resolutions and frame-recording frequencies. Your crash may well have been recorded, providing a great deal of information that we almost never have in other cases. Reconstruction experts experienced with Tesla accidents may be able to pull the videos off the SD card and convert them from their proprietary format into one you can view on your computer.
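As one illustration of that conversion step, here is a sketch that batch-converts recovered clips with the open-source ffmpeg tool; the “.clip” extension and folder layout are placeholders, and your expert may well use different software:

```python
import pathlib
import subprocess

# Placeholder paths: a directory of clips recovered from the mirrored SD card.
src = pathlib.Path("sdcard_image/recovered_clips")
dst = pathlib.Path("converted")
dst.mkdir(exist_ok=True)

for clip in sorted(src.glob("*.clip")):          # extension is hypothetical
    out = dst / (clip.stem + ".mp4")
    # Re-encode to H.264 so the clip plays in any standard viewer.
    subprocess.run(["ffmpeg", "-y", "-i", str(clip),
                    "-c:v", "libx264", "-pix_fmt", "yuv420p", str(out)],
                   check=True)
    print("converted", clip.name, "->", out.name)
```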
Summary
Litigating against Tesla on behalf of someone injured or killed by a defect in its Autopilot or Full Self-Driving systems takes several years and significant expense. The legal issues surrounding these systems are multifaceted, involving consumer protection, safety and liability, regulatory compliance, and driver responsibility. As Tesla’s driver-assistance systems become more widely used on our roadways, the legal landscape will continue to develop, with ongoing fights over how to access information under Tesla’s exclusive control and how best to hold the company accountable for defects in its systems.
Elise Sanguinetti
Elise Sanguinetti is a founding partner at Arias Sanguinetti Wang & Torrijos LLP and is the immediate past president of the American Association for Justice (AAJ), a national trial-lawyers association. Ms. Sanguinetti’s practice focuses on serious-injury and wrongful-death cases, civil appeals, and legal malpractice.
Don Slavik
Don Slavik lives in Steamboat Springs, Colorado, and is the principal of the Slavik Law Firm, LLC, working with clients and firms around the country on products-liability cases, class actions, antitrust, and other complex litigation. He was senior counsel to Robinson Calcagnie, Inc. of Newport Beach from January 2011 through May 2015 and remains of counsel to the firm. He is a graduate of the University of Wisconsin with a B.S. with honors in Nuclear Engineering and a J.D.
Copyright © 2024 by the author.
For reprint permission, contact the publisher: Advocate Magazine