
Bad News For Elon Musk and Tesla: If You Do This With Your Tesla, You Can Be Banned From Driving

Bhavesh Patel has been banned from driving in the United Kingdom after he engaged his Tesla’s Autopilot and moved to the passenger seat.

A man who switched on his car’s autopilot before moving to the passenger seat while travelling along a motorway has been banned from driving for 18 months.

Bhavesh Patel, aged 39, of Alfreton Road, Nottingham, pleaded guilty to dangerous driving at St Albans Crown Court on Friday, April 20.

The court heard that at 7.40pm on May 21, 2017, Patel was driving his white Tesla Model S 60 along the northbound carriageway of the M1, between junctions 8 and 9, near Hemel Hempstead.

Autopilot, Tesla’s autonomous driving feature, aims to assist drivers on highways through a system of cameras, sensors, and GPS.

With important advances in radar, cameras, and GPS, there has been an explosion in research and development of autonomous car technology. Driverless vehicles, which will include fleets of trucks, shuttles, and sharing-economy services like Uber, are set to shake up the driving world for businesses and professionals. They are also expected to substantially reduce road accidents; one report predicts that accidents will drop by 80% by 2040.

So what is a “driverless” vehicle? And what is “autonomous driving technology”? A deeper look at Tesla’s Autopilot provides insight into the bigger picture of driverless car research. Autopilot does not turn a Tesla into a driverless car; it is a driver-assistance feature for highways. Autopilot-enabled vehicles can automatically steer, change lanes, and apply brakes, but they still require a human behind the wheel.

Executive summary

What is Tesla’s Autopilot?

Autopilot is an add-on feature for the Tesla Model S and Model X, announced in October 2014, when cars then in production were first fitted with the Autopilot hardware. It is meant to assist with highway driving. The technology uses a combination of radar, cameras, and GPS. When engaged, Autopilot-enabled vehicles can self-steer, adjust speed, detect nearby obstacles, apply brakes, and park. Enhanced Autopilot, or Autopilot 8.1, applies to all Teslas made after October 19, 2016.

Why does Tesla’s Autopilot matter?

Autopilot is one of the most advanced autonomous driving features currently available to real drivers on the road. Tesla’s philosophy of releasing technology updates incrementally became controversial after the May 2016 fatal crash involving a car with Autopilot enabled. How regulators respond to Tesla’s Autopilot will have implications for all driverless car research and development in the future.
Who does Tesla’s Autopilot affect?

As of July 2016, more than 70,000 drivers were using Autopilot, and they had driven 780 million miles in Autopilot-enabled Teslas. Beyond those owners, lessons from the technology will shape regulations and the development of driverless vehicles in the future, and thus affect any driver or passenger of these vehicles.

When was Tesla’s Autopilot released?

Autopilot, still in beta as of August 2017, was released in October 2015 and is updated whenever new technology becomes available. Tesla owners who have Autopilot can download updates over the air to their cars.
How can I take advantage of Tesla’s Autopilot?

Tesla Model S and X owners with cars made in 2014 or later can purchase Autopilot as an add-on feature. New Tesla buyers can select it as an option at the time of purchase.

Will Tesla’s Self-Driving Cars Be Legal?

The big question, of course, is who is responsible for accidents when Autopilot fails: the car company, the driver, or the system?

Regulating Self-Driving Cars

If you’ve been following the autonomous car debate, you know that there’s a lot involved, mainly centered on safety. In July 2016 it came to light that a Tesla operating in self-driving mode had killed its driver that May, when the system failed to see the white side of a tractor trailer. Since then there have been an increasing number of reports of accidents from around the world that owners and regulators alike attribute to failures of both drivers and Tesla’s Autopilot system. Government regulators are stepping in, opening investigations into the crashes and beginning serious discussions about the future of self-driving vehicles and what it could mean for everyone on the road.

First, to understand the debate you need some context. Nine states, plus Washington, D.C., currently have self-driving laws in place: Nevada (the first state to allow self-driving cars, back in 2011), California, Utah, Arizona, Louisiana, Florida, Tennessee, Michigan and North Dakota. According to the National Conference of State Legislatures, at least 34 states have considered autonomous vehicle legislation in some form or another.

Elon Musk, the head of Tesla, has said that the new cars will be capable of what’s known as Level 5 autonomy. The National Highway Traffic Safety Administration defines the various levels of autonomy, which run from Level 0 to Level 5. Think of Level 0 as an old car: the driver controls everything. Levels 1 through 3 are essentially the cars we have on the roads today, with safety features like antilock brakes, lane-keeping assistance, parking helpers and sensors that keep us from backing into things. Levels 4 and 5 are fully autonomous vehicles, in which you can enter a destination and the car drives the entire trip without input from the driver.

Back in September 2016, after more Tesla Autopilot crashes came to light, NHTSA announced it would investigate the crashes and see what, if anything, it could enact to make the cars safer. The agency also updated its policy for HAVs, or highly automated vehicles, such as Teslas with Autopilot. The update includes 15 best practices for manufacturers covering safe pre-deployment procedures and testing of the systems before they are made available to the public.

Who Is Responsible For Self-Driving Related Accidents?

So why would Tesla decide to include this kind of hardware when it would be illegal to use it in much of the country? Well, as one analyst from Edmunds pointed out in a Reuters story, it’s really more of a “vanity purchase” than anything else. On the other hand, Tesla’s stock has been rather battered recently, and Elon Musk, a man not known for keeping his cool, is anxious to bolster its performance and prove doubters wrong.

This week, Elon Musk said that Tesla would take responsibility only for instances in which the Autopilot system itself failed. That is a very narrow definition of responsibility, but one that makes sense from a company’s standpoint. Musk said it would be up to the insurance companies to determine who was truly responsible for a crash, though he did concede that if there were a problem with the system or the car, Tesla would take responsibility for it, according to a story over at BGR.

Who Is Responsible For The Crash?

The revelation last month that a fatal car crash involved Tesla’s “Autopilot” feature has sparked a debate over liability when it comes to assisted driving: who’s legally at fault in a crash if a car is partly controlled by a computer?

It’s still unclear whether the fatal Florida crash will go to court, but it certainly won’t be the last of Tesla’s woes. Tesla told The New York Times that it has no plans to disable the feature, and that Autopilot is currently enabled in 70,000 cars.

California’s proposed rules for driverless vehicles take aim at Tesla

California regulators have proposed banning the word “autopilot” from electric-car pioneer Tesla’s advertising, teeing up an extraordinary conflict between state officials and the Silicon Valley powerhouse in an era of increasingly automated cars.

The move, in draft state regulations on autonomous cars released late Friday, marks an affront to Tesla’s ambitions and self-image. And it represents a muscular exertion of state power as state and federal officials are working through how they should govern vehicles that soon may require no human driver at all. Some of the state proposals are in line with the wishes of manufacturers; others, far from it.

“Autopilot” is the name for a set of semiautonomous Tesla features that fall far short of the driverless future that tech titans and U.S. officials have embraced as a business and safety opportunity. The company, led by Elon Musk, has argued that the sensors and software that enable those capabilities are part of a fast-evolving system that, working with today’s drivers, has already crossed the “‘better-than-human’ threshold.”
