There has been a lot of publicity in recent months about driverless cars, but the practicalities are far from resolved. Fully automated cars don’t drink and drive, fall asleep at the wheel, text, talk on the phone or put on makeup behind the wheel. With their sensors and processors, they navigate roads without the human failings that all too often result in nasty accidents. But there is one thing self-driving cars do not cope with well – at present, anyway: the unexpected. The human brain is still better than any computer at making decisions in the face of sudden, unforeseen events on the highway: a child running into the street, a swerving cyclist or a fallen tree limb.
Computer algorithms can ensure that self-driving cars obey the rules of the road – making them turn, stop, slow down when a traffic light turns yellow and resume when it turns from red to green. But this technology cannot control the behavior of other drivers. Autonomous vehicles will need to deal with drivers who speed, pass even when there’s a double yellow line and drive the wrong way up a one-way street – not to mention the serious problems created by drunk drivers careering all over the road, suddenly and without any discernible pattern.
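The rule-following side of this is the easy part, and can be sketched in a few lines. The function below is a minimal illustration of the traffic-light logic the paragraph describes; the state names and the `can_stop_safely` flag are assumptions for the example, not any manufacturer's actual API.

```python
def respond_to_light(light: str, can_stop_safely: bool) -> str:
    """Toy rule-following logic for a traffic light.

    'light' is one of "red", "yellow", "green"; 'can_stop_safely'
    says whether braking before the line is physically possible.
    All names here are illustrative.
    """
    if light == "green":
        return "proceed"
    if light == "yellow":
        # Slow and stop if we can; otherwise clear the intersection.
        return "slow_and_stop" if can_stop_safely else "clear_intersection"
    return "stop"  # red
```

Encoding the highway code this way is straightforward; the article's point is that no such table exists for the behavior of the human drivers around the car.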
One solution might be to equip cars with transponders that broadcast their position, speed and direction to other vehicles. This is known as vehicle-to-vehicle, or V2V, communication, and it is similar to the way airplanes avoid each other in the air. While promising, V2V is still at an early stage of development, and it will be effective only when large numbers of vehicles with this capability are on the road. Moreover, the skies are a good deal less crowded than a typical city thoroughfare.
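To make the idea concrete, here is a minimal sketch of what a receiving car could do with such broadcasts. The message fields and the straight-line motion model are assumptions for illustration; real V2V message sets carry far more than position, speed and heading.

```python
import math
from dataclasses import dataclass

@dataclass
class V2VMessage:
    # Hypothetical message: just the three quantities named in the text.
    x: float        # metres east of a shared origin
    y: float        # metres north of a shared origin
    speed: float    # metres per second
    heading: float  # radians, 0 = due east

def time_to_closest_approach(a: V2VMessage, b: V2VMessage) -> float:
    """Seconds until the two vehicles are nearest, assuming each
    holds its current speed and heading (a straight-line model)."""
    rx, ry = b.x - a.x, b.y - a.y                      # relative position
    vx = b.speed * math.cos(b.heading) - a.speed * math.cos(a.heading)
    vy = b.speed * math.sin(b.heading) - a.speed * math.sin(a.heading)
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        return 0.0  # identical velocities: the gap never changes
    return max(0.0, -(rx * vx + ry * vy) / v2)
```

For example, two cars 100 m apart closing head-on at 10 m/s each reach their closest approach in 5 seconds – plenty of warning for an automated braking decision, which is precisely what the transponder scheme is meant to provide.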
Snow, fog, rain and other kinds of weather make driving difficult for humans. It’s no different for driverless vehicles, which stay in their lanes by using cameras that track the lines on the pavement – something they cannot do if the road has a coating of snow. Falling snow or rain can also make it difficult for laser sensors to identify obstacles, and a large puddle left by heavy rain may look like blacktop to an autonomous car’s sensors. Indeed, in reports filed with the US authorities by Google and others, weather was the leading cause of system failures during which, or after which, human drivers had to take back control. So much for the argument put forward by early devotees of driverless cars that the driver could take a nap while the vehicle sped along the motorway! Sadly, life can be more difficult than that. Still, automakers are confident that the technology will continue to improve. Mercedes-Benz already offers a car with 23 sensors that can detect guardrails, barriers, oncoming traffic and roadside trees to keep the vehicle in its lane even on roads with no white lines.
A further problem can be detours and rerouted roads. Google’s bubble-shaped self-driving cars rely heavily on three-dimensional maps that are far more detailed than those available on Google Maps. As any listener to morning radio can attest, news items about detours and rerouting caused by accidents, road repairs and a host of other factors are commonplace. Maps easily become out of date – an intersection with a four-way stop might suddenly acquire an ordinary traffic light or even become a roundabout. Although the pioneers of automated vehicles are confident that new technologies will overcome these unknowns, for now the solutions await implementation.
Self-driving cars use radar, lasers and high-definition cameras to scan the road ahead for obstacles, and the images they generate are assessed by high-powered processors to identify pedestrians, cyclists and other vehicles. But what about potholes? They lie below the road surface, and a dark patch ahead could be an oil spot, a puddle or even a filled-in pothole. Google and other companies hope more precise laser-based sensors will make it easier to distinguish between different hazards – and even to tell the difference between a pothole and the shadow cast by an overpass.
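The kind of sensor fusion being hoped for can be caricatured in a few lines. This is a toy decision rule, with invented thresholds: the underlying intuition is that a real pothole shows a depth step in the laser return, a puddle reflects the laser like a mirror, and a shadow does neither.

```python
def classify_dark_patch(lidar_depth_drop_cm: float, reflectivity: float) -> str:
    """Toy fusion rule for a dark patch on the road ahead.

    Thresholds and field names are illustrative assumptions,
    not taken from any real perception system.
    """
    if lidar_depth_drop_cm > 3.0:
        return "pothole"           # genuine recess in the road surface
    if reflectivity > 0.8:
        return "puddle"            # near-specular return: standing water
    return "shadow_or_stain"       # flat and dull: likely harmless
```

Even this caricature shows why a camera alone is not enough: all three cases can look identical as a dark patch in an image, and it takes a second sensing modality to tell them apart.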
Lastly, there is the issue of making tough decisions. In the midst of heavy traffic, a ball bounces into the road, pursued by two running children. If a self-driving car’s only options are to veer right and strike a telephone pole or to hit the children, which is the better alternative? In one case the children die; in the other, the crash into the pole may kill the car’s occupants. Should the computer give priority to pedestrians or to passengers? When a crash is inevitable, a human driver reacts spontaneously in a split second. But in a car controlled by algorithms, the choice has been predetermined by a programmer. Who or what should play God in these very difficult circumstances?