Investigators say Tesla Autopilot partly to blame for fatal 2016 crash

Investigators have suggested that Tesla’s Autopilot system bears some of the blame for a fatal 2016 crash. They have recommended that the system be declared a contributing factor in the tragic incident because it allowed the driver to go for long periods without looking at the road or keeping his hands on the steering wheel.

This news may prove a stumbling block, with the U.K. set to allow autonomous vehicles onto its roads in the next few years. Are they safe? What are the dangers? Who is liable if things go wrong? These are questions that need answering before such technology is allowed onto our very busy roads.

About the incident

The accident reportedly happened on 7th May 2016, when Tesla enthusiast Joshua Brown was driving a Model S on the highway with the Autopilot system activated. Placing his trust in the system, Brown did not have his hands on the steering wheel.

Unfortunately, the system failed to distinguish between the brightly-lit sky and a large white truck crossing the highway. The underside of the truck’s trailer struck the windshield of the Model S, fatally injuring the driver. The incident is thought to have damaged Tesla’s reputation and raises further doubts over the use of such technology on the roads.

The investigation findings

The board found that the system allowed Brown to, in effect, let the car drive itself, even though the manufacturer had warned against doing so.

Reports blamed the fatal crash on several things:

  • The white truck’s failure to yield;
  • Brown not looking or reacting to the imminent collision;
  • The self-driving system allowing Brown not to pay attention or react;
  • The self-driving system failing to identify the truck and take evasive action.

The findings and recommendations may influence the future of the technology, as well as the laws surrounding the manufacture and use of self-driving vehicles. Lawmakers are reportedly struggling to legislate for them, which isn’t surprising.

Sought-after tech

Other car companies and even technology corporations like Google are investing billions into developing their own self-driving cars. The U.K. is expected to see fully self-driving vehicles on the roads by 2021, with the government recently approving trials of autonomous trucks on our motorways in a bid to create a more efficient system for delivering goods.

Given the exponential growth in technology, putting self-driving vehicles on the roads is not entirely surprising, but it still concerns a lot of people. Critics, including former Top Gear presenter Jeremy Clarkson, say the four-year timeline is much too quick and there is still a lot of work to do. Clarkson noted that a self-driving car he recently tested made a couple of mistakes that could have killed him within the space of just 50 miles.

Chancellor Philip Hammond has said that the driverless vehicles on our roads in 2021 won’t even have a safety attendant on board. He explained the bold move:

“We have to embrace these technologies if we want the UK to lead the next industrial revolution.”

That’s all very well, but perhaps we should check our laces are tied before we rush into the race. If Brown’s death teaches us anything, it’s that there will always be software bugs and glitches that need to be identified and fixed before we put human lives on the line.

