Are Driverless Cars Safe?

Cars without drivers are no longer just something out of science fiction movies. These autonomous vehicles, also known as self-driving cars, are already on the roads and, unfortunately, have been involved in traffic accidents.

However, many of the changes these vehicles will bring will be positive. According to reports, driverless cars are expected to reduce the number of accidents, for example, but they also complicate our definition of “driver” and make it harder to determine who is to blame when accidents do happen.

We are still a few years away from the mass production of autonomous cars, but these vehicles have allegedly already caused two deaths. Laws and safety regulations for driverless cars are still being drafted, and it is clear that legislation will have to keep being updated as the technology advances.

 

Classification of autonomous cars

The Society of Automotive Engineers (SAE International) classifies a car’s autonomous driving capability into six levels. This classification can be used as a basis for legislation on autonomous cars.

Level 0: The car has no automated system that can take control of the vehicle. It may have systems that issue warnings, but nothing more. The driver has full control at the wheel.

Level 1: Cars that include driver-assistance systems such as cruise control, adaptive cruise control, or technology that keeps the car in its lane.

Level 2: A car at this level helps you drive, but you have to stay alert and pay attention. These cars have more advanced automated systems that maintain speed and keep the car in its lane, but if problems arise, the human driver has to take over. Teslas with “Autopilot” are an example of a Level 2 car.

Level 3: Cars at this level can make decisions on behalf of the driver. They use information from the environment to decide whether to change lanes or pass another vehicle, but they may still require emergency human intervention when an accident is possible. Audi already has a prototype of a Level 3 car.

Level 4: A Level 4 car can handle each and every situation on its own within a highly controlled, restricted, and perfectly mapped area that is safe from unexpected events such as severe weather (whether such a place can exist is an open question). It is up to the driver to decide whether to activate the automated mode when it is safe to do so. Once autonomous mode is on, the driver no longer needs to pay attention to what happens on the road.

Level 5: These cars are completely autonomous. No human intervention is required for a car classified as Level 5. This type of car will not have a steering wheel or pedals. The idea is that a person gets in, sits down, tells the car where to go, and can then watch movies, play games, or get some work done. Traveling in one of these cars will feel more like riding the subway or a train.
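For readers who want the classification in a more structured form, here is a minimal sketch of the six levels as a Python enum. The names, comments, and the small helper function are our own illustrative shorthand based on the descriptions above, not an official part of the SAE standard.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Illustrative shorthand for the six SAE driving-automation levels."""
    NO_AUTOMATION = 0           # warnings at most; the driver does everything
    DRIVER_ASSISTANCE = 1       # cruise control or lane-keeping assistance
    PARTIAL_AUTOMATION = 2      # holds speed and lane; driver must stay alert
    CONDITIONAL_AUTOMATION = 3  # changes lanes and passes; driver is the fallback
    HIGH_AUTOMATION = 4         # drives itself within a restricted, mapped area
    FULL_AUTOMATION = 5         # no steering wheel or pedals; no human needed

def driver_must_supervise(level: SAELevel) -> bool:
    """At Levels 0-2 the human driver is still responsible for watching the road."""
    return level <= SAELevel.PARTIAL_AUTOMATION

# Example: the article describes Tesla's Autopilot as Level 2.
print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True
```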

Achieving Level 5 automation has become the new goal for companies like Google. As we said before, the manufacture of autonomous cars is moving faster than the law, and it is clear that several laws will have to change before these vehicles can circulate through the country’s cities.

 

Pending: Legislation on autonomous cars

Currently, in the United States, only nine states have legislation related to driverless cars. The federal government has proposed its first policy on the subject, the Federal Automated Vehicles Policy. This policy begins to outline safety standards, with a 15-point safety checklist that manufacturers are asked to sign. In addition, the federal government will work with the states to develop safety laws for driverless vehicles. This commitment to road safety comes after the two deaths associated with Autopilot in the Tesla Model S.

 

Fatal accident in Florida

In May 2016, in Florida, one of Tesla’s cars with autonomous technology claimed its first victim in an accident involving a truck. The driver of a Tesla Model S, Joshua Brown, 40, had activated the system known as Autopilot when a truck ahead made a left turn across his path; the Tesla did not brake and went under the trailer. Brown died from the injuries sustained in the accident. According to the truck driver, Brown was watching a Harry Potter movie with Autopilot engaged and was driving very fast.

After this first traffic accident involving a car of this type, the legal vacuum became evident, as did the lack of clarity about who should be held responsible for what happened.

 

First lawsuit

In July 2016, Gao Jubin filed the first lawsuit over the Tesla Model S Autopilot, in Beijing. His son, Gao Yaning, 23, died after one of these cars crashed in January. Jubin filed the suit against the manufacturer, alleging that Tesla needs to be more cautious about the capabilities it claims for Autopilot. During the proceedings, Jubin asked for an investigation into whether the Autopilot system was engaged at the time of the fatal accident and asked the multinational to stop using the term “automatic driving,” which in his opinion amounts to misleading advertising.

The lawsuit seeks a modest sum to cover damages. It has not been confirmed whether Gao Yaning’s death was caused by the use of the Model S Autopilot. Regardless, this is sure to be only the first of many lawsuits tied to this new automotive technology.

While driverless cars are on their way to becoming a mass-market product on city streets, traffic accidents involving ordinary cars continue to occur every day.