Sensor areas for environment observation

Automated vehicles are kicking things up a gear

We are well and truly into the 21st century at this point, and a common refrain from pretty much everyone has been “Where’s my flying car?”

Flying cars have proven to be impractical for a variety of reasons, but development of self-driving cars has been continuing nicely – meaning in the future, we’ll eventually be able to let the car handle the mundane driving stuff while we get stuck into some serious gaming instead.

Driverless trains have been around for a long time – the Docklands Light Railway in London opened in 1987, and cities around the world from Vancouver to Kuala Lumpur to Istanbul have installed automated trains.

Trains, of course, only go in two directions (forwards and backwards) and stop at predetermined points. Cars are an entirely different kettle of fish – but that doesn’t mean we aren’t well on the road to self-driving vehicles.

The Society of Automotive Engineers International divides vehicle automation into five (well, technically six) levels.

Level Zero is no automation at all, and this was the state of affairs for pretty much all cars produced before about 2010. The driver is 100% responsible for pretty much everything to do with operating the car. Standard cruise control doesn’t count, as it is unable to react to changes in the environment (e.g. the car in front braking suddenly).

Level One is Driver Assistance – things like lane departure warnings and adaptive cruise control (where the cruise control does react to changes in the environment, automatically slowing down or applying the brakes in the event it is necessary). It can steer or brake/accelerate, but not both at the same time.

Level Two is Partial Automation – according to the US National Highway Traffic Safety Administration, “An advanced driver assistance system (ADAS) on the vehicle can itself actually control both steering and braking/accelerating simultaneously under some circumstances. The human driver must continue to pay full attention (“monitor the driving environment”) at all times and perform the rest of the driving task”.

Level Three is Conditional Automation, essentially “eyes off” – the car can handle all aspects of driving under set conditions (such as in slow-moving traffic on a dual-lane carriageway), but needs the driver to be able to step in at short notice.

Level Four is a more or less fully autonomous vehicle, although there may be situations – such as a serious rainstorm – where the driver needs to take control.

Level Five is your Sci-Fi movie-type fully autonomous vehicle, where the onboard systems handle everything with no input at all from the occupants – to the point where there may not even be controls in the vehicle.
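For reference, the six levels can be captured as a simple lookup table – a minimal Python sketch, with descriptions paraphrased from the summaries above rather than official SAE J3016 wording:

```python
# SAE J3016 driving automation levels, paraphrased from the summaries above.
SAE_LEVELS = {
    0: "No automation - the driver does everything",
    1: "Driver assistance - steering OR braking/accelerating, not both at once",
    2: "Partial automation - steering AND braking/accelerating, driver monitors",
    3: "Conditional automation - 'eyes off' under set conditions, driver on standby",
    4: "High automation - fully autonomous bar rare situations",
    5: "Full automation - no input from the occupants at all",
}

def describe(level: int) -> str:
    """Return a one-line description of an SAE automation level."""
    return f"Level {level}: {SAE_LEVELS[level]}"

print(describe(3))
```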

The Audi A8 is one of the most advanced production passenger vehicles on the market, capable of conditional autonomous driving under certain circumstances.

German car manufacturer Audi has been investing considerable resources into autonomous vehicles and driver assistance technology, and recently launched the Audi A8, one of the most advanced production cars on the market in this regard. According to Audi, it is the first series-production automobile in the world to have been developed for conditional automated driving at Level Three.

According to the automaker, on highways and multi-lane motorways with a physical barrier separating the two directions of traffic, the Audi A8 can – via its traffic jam pilot – handle the driving task in nose-to-tail traffic at speeds up to 60 km/h, taking care of starting, accelerating, steering and braking within its lane.

From a technological standpoint, once traffic jam pilot is activated, the driver can take their hands off the wheel and foot off the accelerator and the car will handle the rest – although the driver must be capable of taking control when the system asks, and local laws may dictate whether the driver can do things like take their hands off the wheel or use a mobile phone.

A camera in the car checks that the driver is capable of stepping back in if needed, using anonymised data measuring head position and eye movement. If driver control is needed, visual and acoustic warnings are issued; if they are ignored, the car will apply the brakes on its own and come to a stop in its lane.

The car has a central driver assistance controller, about the size of a tablet PC, which continually merges the signals from all the vehicle’s sensors into a differentiated model of the surroundings using its high-powered processors.
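Audi doesn’t publish the fusion algorithms inside that controller, but a classic building block for merging readings from different sensors is inverse-variance weighting, where more trustworthy sensors get more say. A minimal, hypothetical sketch – the sensor values and variances below are made up for illustration:

```python
def fuse_ranges(estimates):
    """Inverse-variance weighted fusion of (range_m, variance) pairs.

    Sensors with lower variance (more trustworthy) get more weight,
    and the fused variance is smaller than any single sensor's.
    """
    weights = [1.0 / var for _, var in estimates]
    fused = sum(r / var for r, var in estimates) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused, fused_var

# Hypothetical readings of the distance to the car in front:
# radar says 40.0 m (variance 4.0), laser scanner says 41.0 m (variance 1.0)
distance, variance = fuse_ranges([(40.0, 4.0), (41.0, 1.0)])
print(f"fused distance: {distance:.2f} m (variance {variance:.2f})")
# -> fused distance: 40.80 m (variance 0.80)
```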

The Audi A8 has 24 sensors aboard for environmental observation and monitoring

Fully equipped, the Audi A8 has 24 sensors:

  • Twelve ultrasonic sensors on the front, sides and rear,
  • Four 360-degree cameras on the front, rear and exterior mirrors,
  • One front camera on the top edge of the windscreen,
  • Four mid-range radar sensors at the vehicle’s corners,
  • One long-range radar sensor on the front,
  • One laser scanner on the front, and
  • One infrared camera (night vision assist) on the front.

The laser scanner in the A8 scans an area extending about 80 metres ahead of the car, according to Audi, with a wide aperture of 145 degrees. Within this range, the scanner detects the exact contours of objects, even in darkness.
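Those two figures imply a substantial field of view. Treating the scanned region as a circular sector of radius 80 metres and angle 145 degrees gives a quick back-of-the-envelope estimate – a simplification, since the real coverage pattern will differ:

```python
import math

RANGE_M = 80.0       # scanner reach, per Audi
APERTURE_DEG = 145   # scanner aperture, per Audi

# Area of a circular sector: (angle / 360) * pi * r^2
coverage_m2 = (APERTURE_DEG / 360) * math.pi * RANGE_M ** 2
print(f"approximate coverage: {coverage_m2:.0f} square metres")
# -> approximate coverage: 8098 square metres
```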

“The special capabilities of the laser scanner and the central environmental model in the central driver assistance controller benefit the navigation system, in addition to the Audi AI systems, since the sensor data merger locates the car to within its exact lane. The driver assistance systems react to objects with even greater precision and earlier than in the predecessor model when they detect the end of a traffic backup and initiate braking, for example,” Audi said.

This is all extremely impressive and most of us would agree it’s very much the sort of thing cars should have by now, but it turns out developing automated vehicles is a lot more complex than it sounds.

Miklós Kiss is Audi’s head of advanced development for automated driving. He said the industry overall was currently at Level Two, preparing to make the leap to Level Three in the very near future.

“We are really, really close,” he said, explaining that current technology allowed a car to operate hands-off (but not eyes-off) at speeds up to 60 km/h under ideal conditions.

 “The next step is to bring that to hands-free driving on the freeway at 130km/h.”

Level Two technology has obvious applications in places like Australia, where it’s a long way to anywhere by road, helping combat driver fatigue – but it’s not an excuse to mentally tune out.

“If the driver gets drowsy because of a long boring drive, they won’t drift,” Dr Kiss said.

“But you have to be able to step in in a second (if something happens).”

Dr Kiss explained that automated (or automation-enhanced) vehicles, including the A8, typically perceived their surroundings via some combination of LIDAR and/or SONAR, and a camera.

The LIDAR sensor in the Audi A8 helps the on-board system build an accurate ‘picture’ of what is in front of the car.

One of the things that helps the car is LIDAR (Light Detection And Ranging), which works by sending up to 150,000 laser light pulses each second at a surface. A sensor in the LIDAR gear measures how long each pulse takes to ‘bounce back’, and uses that information to build a ‘map’ or image of whatever surface it is scanning – giving the onboard systems an image of what the car is “seeing”.
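The range calculation at the heart of this is simple time-of-flight arithmetic: the pulse travels out and back at the speed of light, so the distance is speed × time ÷ 2. A minimal sketch – the round-trip time below is purely illustrative:

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def pulse_distance(round_trip_seconds: float) -> float:
    """Distance to a surface from a LIDAR pulse's round-trip time.

    The pulse travels to the surface and back, so halve the total path.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse taking ~533.6 nanoseconds to return hit something ~80 m away -
# roughly the quoted reach of the A8's laser scanner.
print(f"{pulse_distance(533.6e-9):.1f} m")
# -> 80.0 m
```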

Surprisingly, the on-board systems in a number of automated vehicles – including the A8 – are complemented by BlackBerry software. While best known for being a big player in early smartphones, the company now provides “mission critical” software for a range of companies using systems that absolutely must work, and be updated reliably and effectively – such as those controlling automated vehicles (or the information systems in them).

Intel is also heavily involved in automated vehicles; its subsidiary Mobileye has been developing high-definition digital maps for smart vehicles to use in China and Korea.

Dr Kiss said alongside the technical challenges of creating truly autonomous vehicles, there were also the legal issues – such as who was deemed to be ‘in control’ of an autonomous vehicle if there was an incident – as well as how the car should react in uncertain situations.

One example he gave was the question over how an automated car should behave if it is stopped at a traffic light and an emergency service vehicle approaches with its lights and sirens on –  should the automated vehicle move out of the way to let it through, or stay where it is even if it is blocking the emergency vehicle’s path?

The reality, Dr Kiss said, was that fully automated vehicles on roads were decades away at best – although it was possible we would see vehicles capable of parking themselves in an enclosed space such as a parking garage in the foreseeable future, as that was an easier challenge to overcome due to the enclosed and controlled nature of that environment.

Fully self-driving on-road vehicles are still in the realm of science fiction at present, although that is expected to change in the next decade or so (Image: NASA)

The challenges of self-driving cars continued to expand, he said, with new ones presenting themselves almost as soon as existing ones were overcome, and there was a growing realisation that meeting them would likely require a concerted and unified effort from car manufacturers, universities and technology companies.

Ultimately, while it’s going to be quite some time before you can let the car handle your daily commute while you pwn some noobs in whatever game has taken your fancy, it will happen eventually – although whether that’s within the next decade or longer remains to be seen.

Still, that means plenty of time to practice your elite gaming skills at home or while someone else drives in the meantime…