Regulators have concluded their investigation into the Tesla Model S, prompted by the death of Joshua Brown in May 2016. Brown was using the car's self-driving Autopilot system when he collided with a truck.
The regulators' report appears to blame human error rather than the new and unfamiliar technology.
While the self-driving mode can aid drivers, it is not intended to take the place of a real driver. The report warns that the Autopilot system requires the “continual and full attention” of the driver, and even recommends that drivers keep their hands on the wheel at all times, ready to take over if required.
The report also revealed that Brown had seven seconds to react to the danger but apparently did not take the opportunity. Tesla's logs show that he did not apply the brakes in the moments leading up to the crash.
Report subject to criticism
The non-profit advocacy group Consumer Watchdog slammed the decision to blame the driver rather than Tesla. The group believes that the heavy marketing of the word “autopilot” is misleading and implies that the vehicles literally drive themselves.
The situation is reminiscent of the “Navitron Autodrive” system featured in the Simpsons episode “Maximum Homerdrive”, in which the driver falls asleep at the wheel only to find the truck driving itself with ease. The episode featuring the fictional device first aired in 1999; so did it predict the future, or poke fun at unrealistic technology that shouldn’t exist?
U.S. Transportation Secretary Anthony Foxx recognises responsibility on both sides:
“…drivers have a duty to take their obligation (on the road) seriously and automakers must explain the limits of semi-autonomous systems.”
While Tesla was not held responsible for causing the driver's death, the company is encouraged to make its safety warnings as prominent as its marketing of the Autopilot technology. Owning a luxury car with lavishly advertised new technology may well encourage drivers to be less alert and more reliant on that technology.
The report considers human factors among the causes of collisions, such as:
- Behavioural factors, including over-reliance on the technology;
- Travelling too fast;
- Mode confusion;
- Other distractions.
This is not the first crash related to Tesla’s self-driving technology. The system uses a camera to locate other cars and objects, but in some incidents the camera has failed to distinguish a light-coloured vehicle against a brightly lit sky. The vehicle’s radar system may also disregard the obstacle rather than apply ‘unnecessary’ harsh braking.
Some of the current features of Tesla’s self-driving cars include:
- Automatic windshield wipers, activated when rain is detected;
- Automatic high beam headlights;
- Automatic emergency braking (AEB): Can automatically apply the brakes to avoid or reduce the severity of a crash;
- Side collision warning, which alerts the driver when a collision with an adjacent vehicle is imminent.
After the report’s findings were publicised, Tesla confirmed that no recall would be initiated. Although the investigation has come to an end, regulators will continue to monitor autonomous driving technology.