What is an Autonomous Vehicle?
Autonomous vehicles (AVs), otherwise known as self-driving cars, use an assortment of sensors and a powerful onboard computer to interpret their environment and safely navigate traffic without human intervention. A completely autonomous vehicle has no need for a steering wheel. While most new vehicles on the road offer some form of automation, such as adaptive cruise control or lane assist, complete automation is still in its infancy.
To understand the landscape of AVs, it is important to understand the ranking conventions adopted by the automotive industry.
Level 0 — No Automation
At this level, there is no automation beyond basic conveniences such as conventional (non-adaptive) cruise control. All driving tasks (steering, braking, and acceleration) are handled manually by the driver. This level represents the standard vehicle that relies solely on driver input.
Level 1 — Driver Assistance
Here, the automation is aimed at assisting the driver, and driver input is expected at all times. The vehicle can support either steering or braking/acceleration, but not both at once: features like lane keep assist, blind-spot monitoring, and adaptive cruise control work to fight driver fatigue and create a safer driving experience. For an interactive explanation of these features and more, check out Toyota’s safety sense demo page.
Level 2 — Partial Automation
Partial automation is very similar to driver assistance. At this level, however, the vehicle is better equipped to handle driving tasks and can take control of steering, acceleration, and braking simultaneously in limited situations. The driver is still expected to keep their hands on the wheel and remain attentive at all times.
Level 3 — Conditional Automation
Here, cars are able to drive themselves, make turns, and brake efficiently. However, the models driving the autonomy are not yet very robust and require near-ideal conditions. While the vehicle can perform most driving tasks on its own, the driver must be ready to take over whenever the vehicle cannot navigate autonomously.
Level 4 — High Automation
This is a truly autonomous vehicle. The models driving the autonomy are robust, and at this level the vehicle can efficiently interpret its environment and safely navigate with no driver input. However, these vehicles still have a steering wheel, pedals, and everything else that would allow the driver to pilot the vehicle manually. Only in the most extreme circumstances will the driver have to take over the driving tasks.
Level 5 — Full Automation
At this level, the vehicle is purely autonomous and can handle all driving tasks with no human intervention. Those inside the vehicle will not participate in any driving tasks and there is no need for a steering wheel.
How does an Autonomous Vehicle Drive Itself?
Typical AV applications require a collection of sensors to accurately perceive the world around them. With stereo cameras, LIDAR (light detection and ranging) sensors, and radar, an AV can accurately localize itself within its environment and make well-informed steering and throttle decisions without human intervention.
Stereo cameras aim to replicate human vision. With two adjacent lenses, the camera can build a 3D map of the environment and track moving objects. The stereo camera also provides a continuous stream of image data for computer vision, allowing the system to read road signs and traffic lights and classify objects in the road. This sensor alone can provide enough information to navigate the AV; responsible applications, however, include additional sensors as redundancy in case the stereo camera fails.
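To make the stereo principle concrete, here is a minimal Python sketch of how depth falls out of the two adjacent lenses: the same object appears at slightly different pixel positions in each image (the disparity), and triangulation converts that shift into distance. The focal length, baseline, and disparity values below are hypothetical, chosen purely for illustration.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Triangulate depth for one pixel correspondence: Z = f * B / d.

    disparity_px -- horizontal pixel shift of the object between the two images
    focal_px     -- camera focal length expressed in pixels
    baseline_m   -- physical distance between the two lenses, in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 12 cm baseline, 8.4 px disparity
z = depth_from_disparity(8.4, 700.0, 0.12)  # -> 10.0 meters
```

Note the inverse relationship: small disparities mean distant objects, which is why stereo depth gets noisier with range and why the redundancy mentioned above matters.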
Much like the stereo camera, a LIDAR sensor creates a 3D map of its environment. It scans by emitting laser pulses in all directions and detecting the reflected beams. This data is later fused with the stereo camera data to supplement the system’s object detection algorithm.
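Each LIDAR return is essentially a measured range plus the direction the laser was fired in; converting those spherical coordinates into Cartesian points is what produces the 3D map. A minimal sketch, with the angle conventions assumed for illustration:

```python
import math

def lidar_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one LIDAR return (range + beam direction) to a Cartesian point.

    Convention assumed here: x forward, y left, z up; azimuth measured in the
    horizontal plane, elevation measured up from that plane.
    """
    horizontal = range_m * math.cos(elevation_rad)  # projection onto the ground plane
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# A return 10 m straight ahead lands at (10, 0, 0)
point = lidar_point(10.0, 0.0, 0.0)
```

Repeating this over millions of returns per second yields the point cloud that gets fused with the camera data.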
Additionally, a typical AV will have multiple short-range and long-range radar sensors pointed in every direction. By emitting electromagnetic waves and measuring their reflections, the radar can reveal how far away an object is and how fast it is moving.
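The two radar quantities above follow from simple physics: distance comes from the round-trip time of the wave, and radial speed comes from the Doppler shift of the reflection. A sketch of both, assuming a 77 GHz carrier (a common automotive radar band) purely for the example:

```python
C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_s: float) -> float:
    """Distance to target from the wave's round-trip time: R = c * t / 2."""
    return C * round_trip_s / 2

def radial_velocity(doppler_hz: float, carrier_hz: float) -> float:
    """Target speed along the beam from the Doppler shift: v = f_d * c / (2 * f_c).

    Sign convention assumed here: positive Doppler shift means the target
    is approaching.
    """
    return doppler_hz * C / (2 * carrier_hz)

# A 1 microsecond round trip puts the target at roughly 150 m
distance = radar_range(1e-6)
```

Because radar measures velocity directly rather than inferring it from successive frames, it complements the camera and LIDAR data well in rain, fog, and darkness.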
By fusing the data from the aforementioned sensors into a single model, the AV builds the most accurate possible representation of its environment. Using this model, the AV’s machine learning algorithms are best equipped to plot paths, detect objects, and adhere to traffic regulations.
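One common building block for consolidating overlapping measurements, say a camera range estimate and a radar range estimate of the same object, is inverse-variance weighting, the same idea that underlies Kalman-style fusion. A minimal sketch (the measurement values and variances below are illustrative, not from any real sensor):

```python
def fuse_estimates(z1: float, var1: float, z2: float, var2: float):
    """Fuse two independent measurements of the same quantity.

    Each measurement is weighted by the inverse of its variance, so the more
    trustworthy sensor pulls the fused estimate toward itself. The fused
    variance is always smaller than either input variance.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Camera says 10.0 m (variance 1.0), radar says 12.0 m (variance 1.0):
# equally trusted sensors split the difference, and uncertainty halves.
estimate, uncertainty = fuse_estimates(10.0, 1.0, 12.0, 1.0)  # -> (11.0, 0.5)
```

The payoff is that the fused estimate is strictly more certain than any single sensor's, which is exactly why redundant, heterogeneous sensors are worth their cost.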
How Big is the Autonomous Vehicle Market?
By 2026, the global AV market is projected to reach $54.23 billion, almost double 2019’s valuation of $24.1 billion. Given that the highest level of autonomy available to consumers sits at Level 3, there is plenty of room for the market to grow. With major companies such as Toyota, Audi, Tesla, Volvo, Apple, Google, Uber, and Lyft making notable progress in development, this market is expected to be very competitive. Interestingly, StartUs Insights surveyed more than 1,116,000 global startups and emerging companies to find the top ten automotive trends of the coming years. From the tree map below, it is clear that AVs are at the forefront of the industry at 21%, closely followed by Connectivity (18%), Electrification (17%), and Shared Mobility (14%), all of which are closely related to AVs.
Key Benefits of Autonomous Vehicles
Connectivity: In 2019, the United States alone saw 36,096 fatal motor vehicle crashes. A fleet of connected vehicles enables every vehicle on the road to communicate with the others; if cars could pass along positioning and velocity data, an AV could anticipate and react to dangerous situations far faster than any human ever could.
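As a toy example of what shared positioning and velocity data enables, a following vehicle could compute time-to-collision directly from the state broadcast by the car ahead, with no need to first estimate that car's speed from camera frames. The scenario and numbers here are hypothetical:

```python
def time_to_collision(gap_m: float, follower_speed_mps: float, leader_speed_mps: float) -> float:
    """Seconds until contact if neither vehicle changes speed.

    gap_m -- current bumper-to-bumper distance, known from shared positions
    """
    closing_speed = follower_speed_mps - leader_speed_mps
    if closing_speed <= 0:
        return float("inf")  # gap is constant or growing: no collision course
    return gap_m / closing_speed

# 30 m gap, follower at 25 m/s, leader braked to 15 m/s:
# 3 seconds to react, known the instant the leader's broadcast arrives.
ttc = time_to_collision(30.0, 25.0, 15.0)
```

With vehicle-to-vehicle messaging, this warning propagates at network speed rather than at the speed of human perception, which is the core of the safety argument above.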
Electrification: In a recent study by BU’s Institute for Sustainable Energy, researchers estimated that by 2050 the likely net increase in electricity demand from converting the light-duty vehicle fleet to 85% electric and autonomous will be between 13% and 26% of today’s total electricity demand. With such computationally heavy processes, the traditional power supply in most modern gasoline-powered vehicles will not be sufficient to meet the power demands of an autonomous system. From the graph below, it is clear that the transportation sector accounts for 28% of all CO2 emissions in the United States. Even though the automotive industry is manufacturing more electric cars than ever before, decarbonizing the industry will be a slow process. Building AVs exclusively as electric vehicles can greatly accelerate the transportation sector’s decarbonization.
Shared Mobility: Shared mobility concerns demand-driven vehicle sharing, much like the services offered by Uber, Lyft, and Zipcar. Individuals can either carpool in a single vehicle or share a vehicle over an extended period as a rental. Congested cities could greatly relieve their traffic with AV shared mobility: the populations of heavily congested cities, such as Los Angeles, typically rely on private vehicles as their preferred means of transportation. By moving past the stigma attached to public transportation, AVs can encourage people to use more shared transportation services. Additionally, detaching people from their personal vehicles can greatly reduce emissions and make public transportation more available to everyone.
Pandio as an Autonomous Vehicle Solution
With continuous streams of data arriving from live traffic, real-time control algorithms need to be robust and efficient. A system failure can have dire consequences, and data needs to be precisely managed to complement the control algorithms. Pandio’s distributed messaging system leverages the computational and storage capabilities of Apache Pulsar to create a platform specifically tailored to machine learning applications and data at scale. With enhanced security, extensive data auditing, and low latency, Pandio is a top-tier solution to supplement any AV application.
Keep an eye out for the second iteration of this autonomous vehicle series. In the next article, we will break down how data is fused to supplement the real-time control algorithms and how those algorithms drive the AV!