self-driving car (autonomous car or driverless car)
Companies developing and/or testing autonomous cars include Audi, BMW, Ford, Google, General Motors, Tesla, Volkswagen and Volvo. Google's test involved a fleet of self-driving cars -- including a Toyota Prius and an Audi TT -- navigating over 140,000 miles of California streets and highways.
How self-driving cars work
AI technologies power self-driving car systems. Developers of self-driving cars use vast amounts of data from image recognition systems, along with machine learning and neural networks, to build systems that can drive autonomously.
Neural networks identify patterns in the data that is fed to the machine learning algorithms. That data includes images from cameras on self-driving cars, from which the neural network learns to identify traffic lights, trees, curbs, pedestrians, street signs and other parts of any given driving environment.
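As a rough illustration of that idea (and not any carmaker's actual model), the following PyTorch sketch defines a tiny convolutional network that maps a camera frame to one of a handful of driving-scene classes. The class list, network size and 64x64 input resolution are invented for the example, and the network is untrained.

```python
# Tiny, untrained convolutional classifier for camera frames -- illustrative only.
import torch
import torch.nn as nn

CLASSES = ["traffic_light", "tree", "curb", "pedestrian", "street_sign"]

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(CLASSES)),   # assumes 64x64 RGB input frames
)

frame = torch.rand(1, 3, 64, 64)             # stand-in for one camera frame
scores = model(frame)
print(CLASSES[scores.argmax(dim=1).item()])  # predicted label (meaningless until trained)
```

In a real system, such a network would be trained on millions of labeled frames before its predictions mean anything.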
For example, Google's self-driving car project, called Waymo, uses a mix of sensors, Lidar (light detection and ranging -- a technology similar to radar) and cameras and combines all of the data those systems generate to identify everything around the vehicle and predict what those objects might do next. This happens in fractions of a second. Maturity is important for these systems. The more the system drives, the more data it can incorporate into its deep learning algorithms, enabling it to make more nuanced driving choices.
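To make the sensor-fusion and prediction step more concrete, here is a deliberately naive sketch: detections from lidar, radar and cameras are merged when they fall in the same grid cell, and each merged object's next position is extrapolated from its estimated velocity. Every class, function and number is hypothetical; production systems use far more sophisticated tracking and prediction.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str      # "lidar", "radar" or "camera"
    label: str       # e.g. "pedestrian", "traffic_light"
    x: float         # position relative to the car, in meters
    y: float
    vx: float = 0.0  # estimated velocity, meters per second
    vy: float = 0.0

def fuse(detections, cell_size=1.0):
    """Group detections that land in the same grid cell into one object."""
    cells = {}
    for d in detections:
        key = (round(d.x / cell_size), round(d.y / cell_size))
        cells.setdefault(key, []).append(d)
    return [merge(group) for group in cells.values()]

def merge(group):
    n = len(group)
    return Detection(
        sensor="fused",
        label=group[0].label,  # naively trust the first sensor's label
        x=sum(d.x for d in group) / n,
        y=sum(d.y for d in group) / n,
        vx=sum(d.vx for d in group) / n,
        vy=sum(d.vy for d in group) / n,
    )

def predict(obj, dt=0.1):
    """Constant-velocity guess at where an object will be dt seconds from now."""
    return obj.x + obj.vx * dt, obj.y + obj.vy * dt

objects = fuse([
    Detection("lidar", "pedestrian", 12.2, 3.1, vx=-1.0),
    Detection("camera", "pedestrian", 12.4, 2.9, vx=-1.1),
])
print(objects[0], predict(objects[0]))
```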
The following outlines how Google Waymo vehicles work; a simplified sketch of this loop appears after the list:
- The driver (or passenger) sets a destination. The car's software calculates a route.
- A rotating, roof-mounted Lidar sensor monitors a 60-meter range around the car and creates a dynamic three-dimensional (3D) map of the car's current environment.
- A sensor on the left rear wheel monitors sideways movement to detect the car's position relative to the 3D map.
- Radar systems in the front and rear bumpers calculate distances to obstacles.
- AI software in the car is connected to all the sensors and collects input from Google Street View and video cameras inside the car.
- The AI simulates human perceptual and decision-making processes using deep learning and controls actions in driver control systems, such as steering and brakes.
- The car's software consults Google Maps for advance notice of things like landmarks, traffic signs and lights.
- An override function is available to enable a human to take control of the vehicle.
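Read as a whole, the list above is essentially a sense-decide-act loop. The toy, self-contained simulation below (not Waymo's code; every function and number is invented) collapses the sensing steps into a single obstacle-distance reading to show the shape of that loop:

```python
import random

def plan_route(start, destination, step=10.0):
    """Step 1: the software calculates a route -- here, waypoints along a line."""
    waypoints, pos = [], start
    while pos < destination:
        pos = min(pos + step, destination)
        waypoints.append(pos)
    return waypoints

def sense_obstacle_distance():
    """Steps 2-4 collapsed: report the distance to the nearest obstacle."""
    return random.uniform(5.0, 60.0)  # meters, within the ~60 m lidar range

def decide(speed, obstacle_distance):
    """Perception and decision: brake if an obstacle is close, otherwise cruise."""
    if obstacle_distance < 10.0:
        return max(speed - 5.0, 0.0)  # brake
    return min(speed + 2.0, 25.0)     # accelerate toward a ~25 m/s cruise

def drive(start, destination, human_override=lambda: False):
    position, speed = start, 0.0
    for waypoint in plan_route(start, destination):
        while position < waypoint:
            if human_override():      # the human can take control at any time
                return "override"
            speed = decide(speed, sense_obstacle_distance())
            position += speed * 0.1   # advance one 100 ms control tick
    return "arrived"

print(drive(0.0, 100.0))
```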
Cars with self-driving features
Google's Waymo project is an example of a self-driving car that is almost entirely autonomous. It still requires a human driver to be present, but only to override the system when necessary. It is not self-driving in the purest sense, but it can drive itself in ideal conditions and has a high level of autonomy. Many of the cars available to consumers today have a lower level of autonomy but still offer some self-driving features. Self-driving features available in many production cars as of 2019 include the following:
- Hands-free steering centers the car without the driver's hands on the wheel. The driver is still required to pay attention.
- Adaptive cruise control (ACC) down to a stop automatically maintains a selectable distance between the driver's car and the car in front; a minimal controller sketch appears after this list.
- Lane-centering steering intervenes when the driver crosses lane markings by automatically nudging the vehicle toward the opposite lane marking.
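As a loose illustration of the adaptive cruise control item above, the proportional controller below nudges the car's speed toward whatever holds a chosen gap to the vehicle ahead, without ever exceeding the driver's set speed. The gain and numbers are made up; real ACC systems are considerably more sophisticated.

```python
def acc_speed_command(current_speed, gap, desired_gap, set_speed, gain=0.5):
    """Return the speed (m/s) to command for the next control tick."""
    error = gap - desired_gap                   # positive: too far back; negative: too close
    new_speed = current_speed + gain * error
    return max(0.0, min(new_speed, set_speed))  # never exceed the driver's set speed

# Example: doing 25 m/s with a 20 m gap when 30 m is desired -> slow to 20 m/s.
print(acc_speed_command(current_speed=25.0, gap=20.0, desired_gap=30.0, set_speed=30.0))
```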
Levels of autonomy in self-driving cars
The U.S. National Highway Traffic Safety Administration (NHTSA) lays out six levels of automation, beginning with Level 0, where humans do the driving, through driver assistance technologies up to fully autonomous cars. Here are the five levels that follow Level 0 automation (a code-style summary of who does what at each level appears after the list):
- Level 1: An advanced driver assistance system (ADAS) aids the human driver with steering, braking or accelerating, though not simultaneously. An ADAS includes rearview cameras and features like a vibrating seat warning to alert drivers when they drift out of the traveling lane.
- Level 2: An ADAS can control both steering and braking or accelerating simultaneously, while the driver remains fully attentive behind the wheel and continues to act as the driver.
- Level 3: An automated driving system (ADS) can perform all driving tasks under certain circumstances, such as parking the car. In these circumstances, the human driver must be ready to retake control and is still required to be the main driver of the vehicle.
- Level 4: An ADS can perform all driving tasks and monitor the driving environment in certain circumstances. In those circumstances, the ADS is reliable enough that the human driver needn't pay attention.
- Level 5: The vehicle's ADS acts as a virtual chauffeur and does all the driving in all circumstances. The human occupants are passengers and are never expected to drive the vehicle.
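One way to keep the levels straight is to write down, for each level, what the automation handles and what the human must still do. The snippet below is just such a summary in code form; the field names and phrasing are this article's paraphrase, not NHTSA's.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AutomationLevel:
    level: int
    system_handles: str  # what the automation does
    human_role: str      # what the human must still do

NHTSA_LEVELS = [
    AutomationLevel(0, "nothing", "does all the driving"),
    AutomationLevel(1, "steering OR braking/accelerating, not both", "drives and pays full attention"),
    AutomationLevel(2, "steering AND braking/accelerating", "stays fully attentive and acts as the driver"),
    AutomationLevel(3, "all driving tasks in certain circumstances", "must be ready to retake control"),
    AutomationLevel(4, "all driving and monitoring in certain circumstances", "need not pay attention in those circumstances"),
    AutomationLevel(5, "all driving in all circumstances", "is only ever a passenger"),
]

for lvl in NHTSA_LEVELS:
    print(f"Level {lvl.level}: system handles {lvl.system_handles}; human {lvl.human_role}")
```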
Self-driving car safety and challenges
Autonomous cars must learn to identify countless objects in the vehicle's path, from branches and litter to animals and people. Other challenges on the road include tunnels that interfere with the Global Positioning System (GPS), construction projects that force lane changes, and complex decisions, such as where to stop to allow emergency vehicles to pass.
The systems need to make instantaneous decisions on when to slow down, swerve or continue accelerating normally. This is a continuing challenge for developers, and there are reports of self-driving cars hesitating and swerving unnecessarily when objects are detected in or near the roadway.
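To see why that trade-off is hard, consider a deliberately naive braking rule: ignore low-confidence detections as likely false positives, and brake when a confident detection sits within a margin of the stopping distance. Set the confidence threshold too low and the car brakes for phantom objects; set it too high and it ignores real ones. Every threshold below is invented for illustration.

```python
def should_brake(confidence: float, distance_m: float, speed_mps: float) -> bool:
    """Naive rule: brake for confident detections inside 1.5x the stopping distance."""
    stopping_distance = speed_mps ** 2 / (2 * 6.0)  # assume ~6 m/s^2 of braking
    if confidence < 0.3:
        return False                                # treated as a likely false positive
    return distance_m < stopping_distance * 1.5

# At 20 m/s, a confident detection 30 m ahead is inside the margin -> brake.
print(should_brake(confidence=0.8, distance_m=30.0, speed_mps=20.0))  # True
```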
This problem was evident in a fatal accident in March 2018, which involved an autonomous car operated by Uber. The company reported that the vehicle's software identified a pedestrian but deemed it a false positive and failed to swerve to avoid hitting her. This crash caused Toyota to temporarily cease its testing of self-driving cars on public roads, but its testing will continue elsewhere. The Toyota Research Institute is constructing a test facility on a 60-acre site in Michigan to further develop automated vehicle technology.
With crashes also comes the question of liability, and lawmakers have yet to define who is liable when an autonomous car is involved in an accident. There are also serious concerns that the software used to operate autonomous vehicles can be hacked, and automotive companies are working to address cybersecurity risks.
Carmakers are subject to Federal Motor Vehicle Safety Standards (FMVSS), and NHTSA reported that more work must be done for vehicles to meet those standards.
In China, carmakers and regulators are adopting a different strategy to meet standards and make self-driving cars an everyday reality. The Chinese government is beginning to redesign urban landscapes, policy and infrastructure to make the environment friendlier to self-driving cars. This includes writing rules about how humans move around and recruiting mobile network operators to handle a portion of the processing required to give self-driving vehicles the data they need to navigate. The plan also calls for designated "national test roads." The autocratic nature of the Chinese government makes this approach possible, bypassing the litigious, democratic process that such testing must navigate in America.