It is said that there will be no car accidents in the era of driverless vehicles. Are we there yet?


This article was produced by NetEase Smart Studio (public account: smartman163), which focuses on AI and the next big era.

[NetEase Smart News, December 20] In early November, a driverless shuttle bus and a cargo truck collided in Las Vegas. No one was injured and no property was seriously damaged, but the accident drew wide attention from the media and the public, partly because one of the vehicles was driverless: the bus had been running in autonomous mode for the hour leading up to the collision.

This was not the first accident involving a driverless vehicle. Other incidents include Uber crashes in California and Arizona, a Tesla crash in Florida, and several others in California. In almost every case, however, the cause was human error, not the driverless car.

In Las Vegas, the driverless bus detected a truck reversing in front of it. It stopped and waited for the truck to move out of the way. However, the truck's human driver did not see the bus and kept backing up. As the truck closed in, the bus stayed put (it neither moved forward nor backed away), and the truck struck the bus's front bumper.

As a researcher who has studied autonomous systems for more than ten years, I find that this incident raises many questions: Why didn't the bus honk or move away from the approaching truck? Why did it stop where it did instead of moving somewhere safer? If driverless cars are to make roads safer, the bigger question is: What should these vehicles do to reduce accidents?

In my lab, we are developing driverless cars and buses, and we want to address a fundamental safety challenge: even when driverless cars do everything they are supposed to, the drivers of nearby cars and trucks are still flawed, error-prone people.

How do these accidents happen?

There are two main reasons a driverless car can end up in a collision. The first is that its sensors may fail to detect what is happening around the vehicle. Each sensor has its own limitations: GPS works only with a clear view of the sky, cameras need good lighting, lidar cannot function in fog, and ordinary radar is not particularly precise. When one sensor fails, there may be no other sensor with different capabilities available to take over. It is still unclear what the ideal sensor suite for a driverless car looks like, and, constrained by cost and computing power, the solution cannot be simply to add more and more sensors.
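To make the sensor-limitation point concrete, here is a minimal sketch of how a vehicle might decide which sensors it can trust under current conditions. The sensor names, environment fields, and thresholds are illustrative assumptions, not any real vendor's API.

```python
# Hypothetical availability rules for each sensor type described above.
# All names and thresholds are invented for illustration.
SENSOR_LIMITS = {
    "gps":    lambda env: env["sky_visible"],        # needs a clear view of the sky
    "camera": lambda env: env["light_level"] > 0.3,  # needs good lighting
    "lidar":  lambda env: not env["fog"],            # cannot function in fog
    "radar":  lambda env: True,                      # always works, but imprecise
}

def usable_sensors(env):
    """Return the sensors that can be trusted in the given environment."""
    return [name for name, usable in SENSOR_LIMITS.items() if usable(env)]

# On a foggy night, both the camera and the lidar drop out,
# leaving only GPS and low-precision radar.
foggy_night = {"sky_visible": True, "light_level": 0.1, "fog": True}
print(usable_sensors(foggy_night))  # ['gps', 'radar']
```

The sketch shows why simply adding sensors is not a full answer: under the wrong conditions, whole classes of sensors drop out together, and the remaining ones may not cover the gap.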

The second major issue arises when a driverless vehicle encounters a situation that the people who wrote its software did not anticipate, such as a truck driver who does not see the bus and keeps reversing. Like a human driver, the autonomous system must make hundreds of decisions every second, adjusting to new information from the environment. When a driverless car experiences something it has not been programmed to handle, it typically stops, or pulls to the side of the road and waits for the situation to change. The Las Vegas bus was presumably waiting for the truck to move away, but the truck kept getting closer. The bus may not have been programmed to honk or reverse in such cases, or there may have been no room to back up.
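The "stop and wait" default described above can be sketched as a lookup with a fallback. The states and actions here are invented for illustration; real motion planners are far more complex.

```python
# Illustrative decision table: situations the programmers anticipated,
# mapped to the responses they wrote. All names are hypothetical.
KNOWN_RESPONSES = {
    "clear_road":          "drive",
    "obstacle_ahead":      "stop",
    "pedestrian_crossing": "stop",
}

def decide(perceived_state):
    # Anything the programmers never anticipated falls through to the
    # default: stop and wait for conditions to change -- which is what
    # the Las Vegas bus did as the truck backed toward it.
    return KNOWN_RESPONSES.get(perceived_state, "stop_and_wait")

print(decide("obstacle_ahead"))             # stop
print(decide("truck_reversing_toward_me"))  # stop_and_wait
```

Nothing in the table tells the vehicle to honk or back away, so the fallback fires even when a safer action exists.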

The challenge for designers and programmers is to combine information from all the sensors into a computational model that accurately describes the space around the car. The software can then interpret that model to help the vehicle navigate and interact with whatever is happening nearby. If the system's perception is not good enough, the car cannot make good decisions. The main cause of Tesla's fatal crash was that the car's sensors could not distinguish a large white truck ahead from the bright sky behind it.
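A toy version of that fusion step might weigh conflicting sensor readings against each other. The reading format and confidence weights below are assumptions for illustration, not a real perception stack.

```python
# A minimal sketch of fusing imperfect sensor readings into a single
# "is there an obstacle ahead?" estimate. Weights are invented.
def fuse(readings):
    """readings: list of (detected_obstacle: bool, confidence: float).
    Returns True if the weighted evidence for an obstacle outweighs
    the weighted evidence against it."""
    evidence_for = sum(conf for hit, conf in readings if hit)
    evidence_against = sum(conf for hit, conf in readings if not hit)
    return evidence_for > evidence_against

# A camera dazzled by a bright sky misses the white truck,
# but radar and lidar still see it.
readings = [
    (False, 0.2),  # camera: low confidence in glare
    (True,  0.6),  # radar
    (True,  0.7),  # lidar
]
print(fuse(readings))  # True: the fused model still reports an obstacle
```

The point of the sketch is that perception quality bounds decision quality: if every available sensor misreads the scene, no amount of downstream logic can recover.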

Merely matching human drivers' crash rates is not enough to make driverless cars safe. They must also be the "ultimate defensive driver," ready to react when people nearby make unsafe moves. An Uber crash in Tempe, Arizona, in March 2017 is an example.

According to media reports, the driver of a Honda CR-V was traveling along a major road near central Tempe. She wanted to turn left across three lanes of oncoming traffic. She could see that traffic in two of the three lanes was stopped and not moving, but she could not see the lane farthest from her. In that lane, an Uber vehicle was traveling at 61 kilometers per hour in a zone with a 64 km/h speed limit. The Honda driver made her left turn and struck the Uber car at the intersection.

A human driver in the Uber car, approaching the intersection, might have anticipated that a car could turn across the lanes. If such a driver had noticed the situation and slowed down, the crash might well have been avoided. A driverless car that is safer than humans would do the same, but Uber's car was not programmed to.

Improve testing

The Tempe crash and the recent one in Las Vegas are both examples of a vehicle failing to determine the correct action because it did not fully understand the situation. The vehicles obeyed the rules they were required to follow, but that did not ensure their decisions were the safest ones. This is largely a consequence of how most driverless cars are tested.

Of course, the most basic test is whether a driverless vehicle can obey traffic rules: follow traffic lights and signs, understand local laws on lane changes, and otherwise behave like a law-abiding driver. But that is only the beginning. Before driverless cars truly hit the road, they need to be programmed with instructions on how to react when other vehicles do something out of the ordinary.

Testers need to treat other vehicles as adversaries and plan for extreme situations. For example, what should a driverless car do if another vehicle is driving toward it in the wrong direction? Today, a driverless car might try to change lanes but would ultimately stop and wait for the situation to improve. No human driver would do that: people take evasive action even when it means breaking a rule, such as changing lanes without signaling, driving onto the shoulder, or even speeding up to avoid a crash. (From: Futurism; compiled by NetEase)
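The adversarial-testing idea above can be sketched as a scenario check that fails any planner that freezes instead of evading. The scenario fields and the planner stub are assumptions for illustration.

```python
# A hedged sketch of adversarial scenario testing. The planner stub stands
# in for a real system under test; all names are hypothetical.
EVASIVE_ACTIONS = {"swerve", "reverse", "accelerate"}

def planner_under_test(scenario):
    # Naive policy resembling current behavior: stop when threatened.
    return "stop" if scenario["threat_approaching"] else "drive"

def evaluate(scenario):
    """Pass only if the planner evades a threat that keeps closing in."""
    action = planner_under_test(scenario)
    if scenario["threat_approaching"] and action not in EVASIVE_ACTIONS:
        return "FAIL: froze instead of evading"
    return "PASS"

# A wrong-way truck bearing down on the vehicle: merely stopping
# (what the Las Vegas bus did) fails the adversarial test.
print(evaluate({"threat_approaching": True}))
```

Framing tests this way shifts the pass criterion from "did the vehicle break a rule?" to "did the vehicle avoid the crash?", which is the standard the article argues for.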

