It’s a scene many of you may remember from your lessons at driving school: a ball rolls onto the street from between two parked cars. What’s the most likely thing that will happen next? All of us, expecting a child to run out from between the cars a few seconds later, would reduce our speed as a precaution. Up to now, cars have been driven by people who react similarly in such situations – by reducing speed. But what if it’s a computer driving the car? A self-driving car has to be capable of doing everything people do at the wheel. Automated vehicles can react faster than any human, they are constantly alert, and they never get tired – of that there can be no doubt. But how good are they at anticipating events?
A self-driving vehicle has to be able to do two things. First, it has to be capable of identifying pedestrians, cyclists, scooters, traffic signs, and – of course – other cars as well. Technically, many of today’s production models are already capable of this. Second, it also has to be able to interpret traffic situations in order to make predictions about the behavior of other road users. Artificial intelligence (AI) will make this possible. A car equipped with artificial intelligence will not only react faster than any human, but will also drive more defensively. This benefits us all, since it makes the roads in our urban areas safer – for pedestrians, cyclists, and, not least, for the occupants of vehicles. Our development goal, therefore, is clear: Bosch wants to help make cars smart.
For many years now, our engineers have been working hard on automated driving. Nearly 3,000 of them are striving to make automated driving a reality. Moreover, we recently entered into an alliance with Daimler to put self-driving cars onto our city streets. As the basis for automated driving, driver assistance systems are a fast-growing area of business for Bosch. Our sales in this area first passed the billion-euro mark in 2016, and the orders we received last year were worth 3.5 billion euros. Unit sales of our radar sensors alone will grow 60 percent this year, and those of our video sensors by 80 percent.
But now we’re moving beyond sensor technology, and extending our expertise in the area of artificial intelligence. To achieve this, we will be investing 300 million euros in the Bosch Center for Artificial Intelligence over the next five years. This center will employ some 100 experts at locations in India, the U.S., and Germany.
As Mr. Bulander already said, our goal is to reduce accidents to zero. Automated driving helps save lives, since it makes our roads safer, and artificial intelligence is the key to making it work. But we have a lot of ground to cover before a computer on wheels can anticipate and interpret events at least as well as a human driver. Fundamentally speaking, this work involves three crucial steps.
The first of these is understanding: the car has to know what its sensors are detecting. Like a human being, a computer with artificial intelligence first has to learn. In this context, experts speak of deep learning. But while a small child needs to see only a few trucks before it can recognize any truck as such, computers in the laboratory have to see millions of commercial vehicles before they can identify a truck. To be viable in road traffic, artificial intelligence has to sift through millions of images and reliably identify cars, trucks, pedestrians, cyclists, trees, and other objects – including the ball I mentioned earlier.
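To make this tangible, here is a minimal sketch of what such learning looks like in code. The deliberately tiny network, the invented list of object classes, and the random tensors standing in for millions of real camera images are all illustrative assumptions – this is not our actual perception stack:

```python
# Illustrative only: a tiny image classifier trained with deep learning.
import torch
import torch.nn as nn

CLASSES = ["car", "truck", "pedestrian", "cyclist", "tree", "ball"]  # invented

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, len(CLASSES)),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

images = torch.randn(64, 3, 64, 64)             # stand-in camera images
labels = torch.randint(0, len(CLASSES), (64,))  # stand-in annotations

for epoch in range(3):   # real training runs over millions of images
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```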
The second step is enabling the car to make decisions. Here, too, a comparison with human learning makes sense. Cars have to be capable of more than perceiving and understanding their surroundings; they also have to learn to anticipate – to judge what is most likely to happen in the next few seconds. The full range of sensor data provides the basis on which artificial intelligence can make decisions. When radar and video data are merged, the image of the car’s surroundings becomes more detailed, allowing pedestrians and their direction of movement to be identified. On this basis, the AI system computes the probability of someone stepping onto the road ahead and initiates braking in good time.
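The sketch below illustrates the idea in deliberately simplified form: radar and video detections of a pedestrian are fused, a crossing probability is estimated, and braking is triggered early if it is high. The fusion weights, the probability formula, the threshold, and every number are illustrative assumptions, not a real decision system:

```python
# Toy sketch: fuse radar and video detections of a pedestrian and decide
# whether to brake as a precaution. All numbers are invented.
from dataclasses import dataclass

@dataclass
class Detection:
    lateral_m: float   # distance from the curb in metres (negative = on sidewalk)
    speed_mps: float   # speed toward the road in metres per second

def fuse(radar: Detection, video: Detection) -> Detection:
    """Merge radar (strong on speed) with video (strong on position)."""
    return Detection(
        lateral_m=0.3 * radar.lateral_m + 0.7 * video.lateral_m,
        speed_mps=0.7 * radar.speed_mps + 0.3 * video.speed_mps,
    )

def crossing_probability(d: Detection, horizon_s: float = 2.0) -> float:
    """Crude estimate: will the pedestrian reach the road within the horizon?"""
    if d.speed_mps <= 0:
        return 0.05                                  # moving away: residual risk
    time_to_road = max(-d.lateral_m, 0.0) / d.speed_mps
    return max(0.0, min(1.0, 1.0 - time_to_road / horizon_s))

radar = Detection(lateral_m=-1.2, speed_mps=1.5)
video = Detection(lateral_m=-1.0, speed_mps=1.3)
p = crossing_probability(fuse(radar, video))
print(f"crossing probability: {p:.2f}")
if p > 0.5:
    print("initiating precautionary braking")
```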
The third step toward self-driving cars is high-resolution maps. We are working on these together with TomTom, the Dutch provider of maps and traffic information, as well as with the Chinese companies AutoNavi, Baidu, and NavInfo. Our vision for the future is that vehicles should use their sensor data to keep the cloud-based digital map constantly up to date. We want to create an open standard for this. After all, the empirical knowledge we have gathered suggests that keeping a high-resolution map up to date for the freeways of Europe, North America, and Asia Pacific will require vehicle fleets of around one million vehicles each. We have already reached one milestone here. To find their way through a city, automated vehicles need a high-resolution map that tells them precisely where they are at all times. To this end, we and TomTom have premiered our “road radar signature,” which is based on data from our radar sensors. As cars drive along, billions of radar reflection points are entered into the map, replicating the course of the road. Automated vehicles can use this signature to determine their exact location – both in their lane and in a wider geographical context – down to a few centimeters, even at night and in poor visibility.
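As a simplified illustration of how such a signature can be used for localization, the toy example below matches the reflection points a car currently observes against the points stored in the map and recovers the car’s position offset. A production system solves a much harder matching problem with unknown correspondences; here all points are invented and the correspondences are known:

```python
# Toy localization against a stored radar signature: with matched point
# pairs, the least-squares translation is just the mean displacement.
import numpy as np

rng = np.random.default_rng(0)

# Reflection points stored in the high-resolution map (x, y in metres).
map_points = rng.uniform(0, 100, size=(200, 2))

# What the car's radar sees right now: the same points, shifted by the
# car's (unknown) position error and corrupted by measurement noise.
true_offset = np.array([0.42, -0.17])  # metres
observed = map_points + true_offset + rng.normal(0, 0.03, size=(200, 2))

estimated_offset = (observed - map_points).mean(axis=0)
print(f"estimated offset: {estimated_offset.round(3)} m")  # ~[0.42, -0.17]
```

Even with noisy measurements, averaging over hundreds of reflection points pins the offset down to a few centimeters – which is the point of the signature.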
It should be clear from what I have said that data play a crucial role in automated driving. A self-driving car generates huge quantities of data – as much as one gigabyte every second. Processing this flood of data calls for more than classic control units: a car equipped with artificial intelligence needs a brain. And in the future, this brain for self-driving cars will come from Bosch. Our AI onboard computer is expected to go into production by the beginning of the next decade at the latest.
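A quick back-of-the-envelope calculation makes the magnitude clear – at one gigabyte per second, a single hour of driving already yields roughly 3.5 terabytes:

```python
# Back-of-the-envelope arithmetic for the data rate quoted above.
GB_PER_SECOND = 1
for minutes in (1, 10, 60):
    total_gb = GB_PER_SECOND * minutes * 60
    print(f"{minutes:>2} min of driving -> {total_gb:>4} GB (~{total_gb / 1024:.1f} TB)")
```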
This AI computer will guide self-driving cars through even complex traffic situations, or ones that are new to the car. To do so, it will be capable of up to 30 trillion floating-point operations per second – three times as many as a human brain. And with every new situation it encounters on the road, the artificial intelligence will learn more. In artificial neural networks, our computer will store whatever it learns while the car is moving. In the laboratory, experts will then check that what has been learned is correct. Following further testing on the road, the artificially generated knowledge structures can be transmitted to any number of other AI onboard computers in an update.
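As a hedged sketch of that update flow, the snippet below saves the learned parameters of one vehicle’s neural network and loads them into another vehicle with the same network architecture. The file name and the tiny network are illustrative assumptions; the lab validation step is only indicated by a comment:

```python
# Illustrative update flow: knowledge learned on the road is validated
# offline and then copied into other onboard computers.
import torch
import torch.nn as nn

def make_net() -> nn.Module:
    return nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))

fleet_car = make_net()  # network that learned something while driving
# ... on-road learning and laboratory validation would happen here ...

torch.save(fleet_car.state_dict(), "validated_update.pt")  # publish update

other_car = make_net()  # any other vehicle with the same architecture
other_car.load_state_dict(torch.load("validated_update.pt"))
```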
To sum up, our objective is accident-free driving, and we will achieve it with the help of automated driving. Even now, vehicles can react faster than any human being, but they also have to be able to anticipate better than we can. The key to achieving this is our AI onboard computer – it will help make the roads in our megacities significantly safer.
If you want to know more about how automated driving will find its way onto our city streets at the start of the next decade, I invite you to visit our “automated” station.