Despite the impressive strides made by the automotive industry in recent years, fully autonomous cars may still be decades away. What’s already clear, however, is that they will be powered by two key components: artificial intelligence (AI) and time series data, meaning data analysed through the lens of time. In tandem, these technologies will enable automakers to master the two most important aspects of autonomous driving: detecting anomalies and training AI models. Completely driverless cars won’t become the norm until manufacturers meet very high safety standards, which means vehicles must accurately identify and predict potential problems before they become safety hazards for passengers.
This is where time series data and AI are poised to impact the automotive industry.
Building intelligence with time series data
Time series data—time-stamped data sequenced chronologically—has become an invaluable resource for automakers, just as it has across many other industries. Sensors producing time series data are now ubiquitous in daily life, creating smarter cities and covering factory floors. These sensors enable real-time data collection, transformation, and response to create highly intelligent systems.
Simply put, raw data from sensors is foundational for real-world AI systems. When it comes to autonomous vehicles (AVs), analysing this data in real-time provides an understanding of the world around them so they can drive safely—knowing if a light is red or green, keeping an adequate distance from other cars, following traffic signs, maintaining a safe speed, and so on.
Creating true automotive intelligence by gathering, transforming, and reacting to data is a continuous and complex process. Systems need to constantly refine, validate, and transform data, and then simulate models that create a foundation for real-time decision-making. Real-time streaming data from connected sensors also gives manufacturers insight into how vehicles perform at any given moment. Combining this data with powerful algorithms helps automakers monitor performance levels across their fleet of cars and anticipate customer needs.
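As a minimal sketch of the gather-transform-react pattern described above, a sliding window over time-stamped sensor readings can feed simple aggregates into downstream decision logic. This example assumes plain in-memory buffering rather than any particular time series platform, and the class and field names are illustrative:

```python
from collections import deque
from statistics import mean

class RollingSensorWindow:
    """Keep a sliding window of time-stamped readings and expose
    simple aggregates for downstream decision-making."""

    def __init__(self, max_len=100):
        # Each entry is a (timestamp_ms, value) pair; old entries
        # fall off automatically once the window is full.
        self.window = deque(maxlen=max_len)

    def ingest(self, timestamp_ms, value):
        self.window.append((timestamp_ms, value))

    def average(self):
        return mean(v for _, v in self.window)

# Stream in (timestamp, speed) readings; only the most recent
# readings influence the aggregate the control loop reacts to.
w = RollingSensorWindow(max_len=3)
for t, v in [(0, 50.0), (10, 52.0), (20, 54.0), (30, 56.0)]:
    w.ingest(t, v)
print(w.average())  # mean of the 3 newest readings: 54.0
```

In a real vehicle this aggregation would run continuously per sensor, with the outputs validated against expected ranges before they reach the models.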
The data that drives autonomous cars
At its core, AI is about manufacturing intelligence at scale, and time series data contributes to that intelligence by providing chronological context across data sources. AI models are critical to the success and safety of AVs, and they consume vast amounts of data streamed from cameras, LiDAR, radar, and other sensor sources.
Sensors cover virtually every angle of AVs and are responsible for understanding vehicle surroundings. They monitor conditions at 360 degrees, providing autonomous systems with a significant advantage over human drivers. Compared to the human field of view, which is roughly 210 degrees horizontally and 150 degrees vertically facing forward, a vehicle with a comprehensive, 360-degree sense of awareness has the potential to be significantly safer.
Then, with large volumes of data collected, automotive engineers can train the AI models to respond accurately and in real-time—long before they send vehicles out on public roads. Take object detection, for example. When a vehicle identifies something outside the expected norm, such as a ball rolling into the street or a garbage truck backing out of an alley, it needs to make a decision and take corrective action instantly.
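One simple way to flag a reading "outside the expected norm" is a z-score check against recent history. The sketch below is a deliberately minimal illustration of the idea, not any production AV technique; the threshold and the distance-sensor framing are assumptions:

```python
from statistics import mean, stdev

def is_anomaly(history, reading, threshold=3.0):
    """Flag a reading whose z-score against recent history
    exceeds the threshold (a hypothetical cutoff)."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return reading != mu
    return abs(reading - mu) / sigma > threshold

# Distances (metres) to the nearest object ahead, sampled over time.
history = [25.1, 24.9, 25.0, 25.2, 24.8]
print(is_anomaly(history, 24.9))  # False: within normal variation
print(is_anomaly(history, 3.0))   # True: something just entered the lane
```

Real systems fuse many such signals across sensors before deciding on corrective action, but the chronological context is what makes even this toy check possible.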
Using this data at scale allows manufacturers to create highly intelligent, self-healing systems. The intelligence continuously builds from multiple data sources, so it becomes smarter, faster, and more precise. However, because these streams of data never stop, the AI systems that support AVs need to be built on a platform that can support high-volume, high-cardinality time series data. Consider a sensor that measures up to 50 different data points every millisecond. Now consider an autonomous car with as many as 40 sensors on it. Those sensors produce high-cardinality data at an enormous rate, adding millions of data points every second per vehicle.
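A quick back-of-envelope calculation using the figures above (50 data points per millisecond per sensor, 40 sensors per vehicle) shows why the platform requirement is so demanding:

```python
# Figures from the text; real sensor suites vary widely.
points_per_ms_per_sensor = 50
sensors = 40
ms_per_minute = 60 * 1000

points_per_minute = points_per_ms_per_sensor * sensors * ms_per_minute
print(points_per_minute)            # 120,000,000 points per vehicle per minute
print(points_per_minute * 60 * 24)  # ~172.8 billion points per vehicle per day
```

At roughly two million points per second for a single vehicle, a fleet multiplies that ingest load by thousands, which is why purpose-built time series storage matters here.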
Relieving the pain points of AVs
While the ongoing development of AVs is an incredible feat of engineering, challenges remain even as businesses improve AI and data management. First, there is a virtually infinite number of anomalies that could affect driving conditions. Because vehicles with even basic driver assistance have only been a reality for the past decade or so, there are many anomalies these models have never encountered. Training and improving the models requires more data, and that process will continue indefinitely. Yet the overwhelming bulk of collected data goes to model validation, so capturing data around rare anomalies at a pace and scale that lets automakers make significant strides in anomaly identification remains a challenge.
As the need for more anomaly data expands, AI models must evolve to increase the options available for corrective action. For example, the default behaviour for today’s assisted driving models is to stop when a significant anomaly is detected. That may be the safest course of action for the person in the car, but it may not be the safest or most efficient action for other cars on the road. Humans understand that rather than stopping on the motorway when there’s an object in the road, it’s better to go around it.
Another challenging variable for AVs is weather. GPS and other technologies can help mitigate the effects of weather, but significant advances will be necessary to ensure consistent performance, especially in rainy or snowy conditions.
This is where fully autonomous vehicles can take advantage of their omnidirectional, 360-degree sensors to assess anomalies and read the environment to determine the safest and most efficient corrective action. The time series data produced by these sensors makes this real-time analysis and response possible.
Beyond the data individual vehicles collect and use for real-time decision-making, another important area will be how vehicles share data with one another as more autonomous vehicles take to the road. This includes vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication, where information such as speed, direction, traffic congestion, and braking status gives each vehicle more context for its decisions.
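To make the idea concrete, a V2V payload carrying the state fields mentioned above might be modelled as a small, serialisable record. The schema and field names below are purely illustrative assumptions, not a standard wire format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class V2VMessage:
    """Hypothetical V2V payload; field names are illustrative only."""
    vehicle_id: str
    speed_kph: float
    heading_deg: float  # 0-360, clockwise from north
    braking: bool
    timestamp_ms: int

    def to_json(self):
        return json.dumps(asdict(self))

# One vehicle broadcasts its state; a nearby vehicle decodes it
# and can factor the braking signal into its own decisions.
msg = V2VMessage("veh-042", 88.5, 270.0, True, 1700000000000)
received = V2VMessage(**json.loads(msg.to_json()))
print(received.braking)  # True: the vehicle ahead is braking
```

Because each message is time-stamped, a stream of them is itself time series data that receiving vehicles can analyse just like their own sensor feeds.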
This data is also important for historical analysis because it allows for improved route optimisation and accident avoidance. For example, if an increased rate of AV accidents is noted at a certain location (perhaps due to a failure in the underlying AI model), sharing that data lets other AVs choose alternate routes that avoid situations where self-driving vehicles struggle. The collective intelligence gained from sharing time series and other types of data among AVs leads to a self-improving system where each vehicle’s decisions are enhanced by the aggregated experiences of others, thereby improving overall traffic coordination and safety.
The confluence of time series and AI
Ultimately, collecting, managing, and analysing time series data are table-stakes capabilities, because instrumenting real-world operations over time produces the base-level data required to build accurate predictive models. When it comes to autonomous cars, the intelligence manufactured from this data and AI will deliver vehicles that are, in theory, more intelligent, and consequently more capable and safer, than human drivers. Data will run through increasingly sophisticated learning models and serve as a foundational component. On a grander scale, autonomous cars will embody this intelligence, but it will also manifest across many other industries and applications.
The data revolution is coming to the automotive industry. Automakers face several hurdles before autonomous cars become commonplace on the roads, but together, time series data and AI provide a path forward.
About the author: Evan Kaplan is Chief Executive of InfluxData