Automobiles were once primarily mechanical machines operated entirely by people. Nowadays, sophisticated technologies are assuming many of the tasks formerly done by drivers. Numerous manufacturers are incorporating some degree of autonomy or intelligence into their automobiles, and every major automaker has been developing concept vehicles with self-driving capability.
This article looks at the technology behind autonomous vehicles and answers some essential questions about how autonomous cars affect car insurance.
When it comes to autonomous vehicles, many alternative approaches exist, depending on the kind of technology the manufacturer wants to deploy.
Technology behind Autonomous Vehicles
Autonomous driving tasks fall into three major categories: perception of the environment, motion planning, and control of the vehicle. Within each of these categories, particular technologies and processes help the industry develop solid solutions.
Perception Of The Environment
Perception of the environment covers every activity that captures information about the world around a vehicle at a specific point in time and converts that information into a valuable set of data. That data is used to make decisions and to instruct the control algorithm on how best to control the vehicle safely.
What steps must be taken to get information from an environmental perspective?
According to Merriam-Webster, a sensor is a device that detects a physical stimulus (such as heat, light, sound, pressure, magnetism, or a particular motion) and transmits a resulting impulse.
Without sensors, self-driving cars would be impossible: they could not see or feel their surroundings or gather the information required to drive safely. This data is processed and evaluated to construct a route from A to B and to provide the necessary instructions to the vehicle’s controls, such as acceleration, braking, and steering.
Information gathered by sensors in autonomous vehicles, such as the route ahead, traffic congestion, and any obstructions on the road, is also exchanged among automobiles that have been linked through machine-to-machine (M2M) technology.
The following sensors are the most frequently used in an autonomous vehicle:
- Inertial Measurement Unit (IMU)
- Camera sensors
- Radio Detection and Ranging (Radar) sensors
- Light Detection and Ranging (LIDAR) sensors
The IMU provides the vehicle with changes in acceleration, angular velocity, and orientation: critical information that allows the vehicle to make informed decisions.
Information from cameras, radars, and lidars helps the vehicle make sense of the world. Cameras offer relevant information for object identification and classification, while radars and lidars provide information about the distance between the vehicle and adjacent objects.
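To make the IMU's role concrete, here is a minimal dead-reckoning sketch: it integrates the acceleration and angular-velocity readings described above into an updated position and heading. This is an illustrative toy model (the function name, 2-D state, and noise-free readings are assumptions, not any vendor's actual implementation).

```python
import math

def dead_reckon(x, y, heading, speed, accel, yaw_rate, dt):
    """Advance a simple 2-D vehicle state estimate by one IMU time step.

    accel    : longitudinal acceleration (m/s^2) reported by the IMU
    yaw_rate : angular velocity about the vertical axis (rad/s)
    dt       : time step (s)
    """
    speed += accel * dt                  # integrate acceleration into speed
    heading += yaw_rate * dt             # integrate angular velocity into orientation
    x += speed * math.cos(heading) * dt  # project the motion onto the map frame
    y += speed * math.sin(heading) * dt
    return x, y, heading, speed
```

In practice the IMU drifts, so real systems fuse these estimates with GNSS, cameras, and lidar rather than trusting integration alone.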
Camera sensors
How do they work?
Just like human drivers, autonomous vehicles use cameras and sensors to “see” and “understand” the things on the road in front of them. Each vehicle is equipped with cameras that provide real-time information and a 360-degree view of the environment, resulting in a broader picture of traffic situations.
Modern digital cameras with 3D capabilities can present impressive, lifelike pictures. With image sensors embedded in the vehicle, these cameras identify objects, categorize them, and calculate the distances between them and the vehicle. For example, the cameras can detect other vehicles, pedestrians, bicycles, traffic signs, signals, road markings, bridges, and guardrails.
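One classical way a camera can estimate distance, as mentioned above, is the pinhole-camera relation: if the real size of an object and the camera's focal length are known, similar triangles give the range. The function below is a simplified sketch (the name, the known-height assumption, and the example values are illustrative, not a production perception pipeline).

```python
def distance_from_camera(real_height_m, pixel_height, focal_length_px):
    """Estimate range to an object of known size with the pinhole camera model.

    real_height_m   : true height of the object (e.g. ~1.7 m for a pedestrian)
    pixel_height    : height of the object's bounding box in the image (pixels)
    focal_length_px : camera focal length expressed in pixels
    """
    # Similar triangles: pixel_height / focal_length = real_height / distance
    return real_height_m * focal_length_px / pixel_height
```

A 1.7 m pedestrian appearing 170 pixels tall to a camera with a 1000-pixel focal length would be estimated at about 10 m away; stereo cameras and learned depth models refine this basic idea.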
Areas for improvement
Modern camera sensors are far from ideal, and some of their problems are unavoidable. Inclement weather such as fog, snow, and rain obstructs cameras, increasing the risk of traffic accidents. In these circumstances, the computer cannot make a sound judgment about what the vehicle should do, since the pictures from the cameras aren’t good enough. The algorithms can also fail when objects are coloured similarly to the backdrop or when the contrast between them and the background is reduced.
Radio Detection and Ranging (Radar) sensors
How do they work?
Radar (Radio Detection and Ranging) sensors contribute critically to the entire autonomous-drive function. They transmit radio waves that detect objects and measure their distance and speed relative to the vehicle in real time.
Radar sensors are typically placed around the vehicle, with both short-range and long-range units serving different purposes. Short-range (24 GHz) radar supports blind-spot surveillance, lane-keeping assistance, and parking aids, while long-range (77 GHz) radar enables adaptive cruise control and braking assistance. In contrast to camera sensors, radar systems usually have no problem detecting objects in fog or rain.
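The two measurements a radar provides, distance and relative speed, follow directly from the physics of the transmitted wave: range comes from the echo's round-trip time, and relative speed from the Doppler shift of the reflection. A short sketch of both formulas (the function names and the 77 GHz default are illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_s):
    """Range from the echo's round-trip time: the wave travels out and back."""
    return C * round_trip_s / 2.0

def radar_relative_speed(doppler_shift_hz, carrier_hz=77e9):
    """Relative speed of a target from the Doppler shift of the reflected wave.

    For a reflection, the shift is f_d = 2 * v * f_carrier / c, so
    v = f_d * c / (2 * f_carrier). Positive means the target is approaching.
    """
    return doppler_shift_hz * C / (2.0 * carrier_hz)
```

An echo returning after one microsecond, for instance, corresponds to a target roughly 150 m away, which is why long-range 77 GHz units suit adaptive cruise control.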
Areas for improvement
There is little doubt that pedestrian-recognition algorithms still need a great deal of development, because only 90-95% of pedestrians are correctly identified by the radar sensors deployed in current vehicles. Furthermore, the commonly used 2D radars cannot correctly identify the height of an object, since the sensors scan only horizontally, which can be problematic when driving beneath bridges or road signs. To address these problems, a broader range of 3D radar sensors is presently being developed.
Light Detection and Ranging (LIDAR) sensors
How do they work?
Lidar sensors operate much like radar systems; the main difference is that they use lasers rather than radio waves. Besides calculating the distances to different things on the road, lidar can generate three-dimensional pictures of the objects identified and map the environment. Rather than relying on a limited field of view, lidar can be configured to generate a complete 360° map around the car. These two benefits are why lidar systems are the choice of autonomous-vehicle developers such as Google, Uber, and Toyota.
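The 3-D map mentioned above is built from individual laser returns: each return is a range plus the beam's horizontal and vertical angles, which convert to a Cartesian point. Sweeping the beam through 360° of azimuth yields the point cloud. A minimal sketch of that conversion (function name and axis convention are assumptions):

```python
import math

def lidar_point_to_xyz(range_m, azimuth_rad, elevation_rad):
    """Convert one lidar return (range plus beam angles) to Cartesian coordinates.

    azimuth_rad   : horizontal angle of the laser beam (0 = straight ahead)
    elevation_rad : vertical angle of the beam (0 = horizontal)
    """
    horizontal = range_m * math.cos(elevation_rad)  # projection onto the ground plane
    x = horizontal * math.cos(azimuth_rad)          # forward
    y = horizontal * math.sin(azimuth_rad)          # left
    z = range_m * math.sin(elevation_rad)           # up
    return x, y, z
```

A spinning unit emits millions of such returns per second; downstream software clusters the resulting points into obstacles and free space.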
Areas for improvement
Because rare-earth metals are required to build adequate lidar sensors, they are considerably more costly than the other autonomous-vehicle sensors. Lidar-based driving systems may cost well over $10,000, while the best sensors used by Google and Uber cost up to $80,000. Snow or fog may also occasionally obstruct lidar sensors and impair their ability to identify objects on the road.
To complete an appropriate sensor suite, a Global Navigation Satellite System (GNSS) receiver supplies the system with the vehicle’s location on the globe, information that motion planning requires.
How does Motion Planning work?
Motion planning mainly aims to move the vehicle from point A to point B safely. This may seem like a simple job: doesn’t the vehicle just move straight from A to B? In truth, it is challenging. Countless components and variables can change over the course of the automobile’s mission, and the vehicle must evaluate every available option to decide the best action at each stage of the route.
Motion planning has been classified into three major sections: the mission planner, the behavioural planner, and the local planner.
The mission planner’s primary aim is to establish how to travel to a specific location on the road network. You can think of the mission planner as a map application: it gives the car its route and plans it based on many variables, such as traffic.
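Route planning over a road network is classically framed as a shortest-path search on a weighted graph, where edge weights can encode distance or traffic-aware travel time. A compact sketch using Dijkstra's algorithm (the graph shape and function name are illustrative, not any vendor's planner):

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a road graph {node: [(neighbour, cost), ...]}.

    The cost can be distance, expected travel time, or any traffic-aware
    weight; the planner simply finds the cheapest path from start to goal.
    """
    frontier = [(0.0, start, [start])]  # (cost so far, node, path taken)
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, edge_cost in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(frontier, (cost + edge_cost, neighbour, path + [neighbour]))
    return float("inf"), []  # goal unreachable
```

Production systems use heavily optimized variants (A*, contraction hierarchies) over graphs with millions of nodes, but the principle is the same.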
The primary goal of behavioural planning is to comprehend the car’s present situation and provide recommendations for safe action.
During behavioural planning, the vehicle considers information such as:
- Road rules
- Kind of road
- Climate conditions
- Cyclists or pedestrians nearby
For example, it must decide how long to wait at a traffic light or stop sign before moving safely. The behavioural planner is responsible for tracking the vehicle’s speed and the speed of the vehicle in front, and for decisions such as decelerating to a stop, staying stopped for a while, or merging into a lane.
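Behavioural planners are often structured as state machines over manoeuvres like the ones just listed. The sketch below models only the traffic-light case with three hypothetical states (the state names, inputs, and one-second minimum stop are assumptions for illustration):

```python
def next_behaviour(state, light, stopped_duration_s, intersection_clear):
    """Tiny behavioural state machine for a traffic light (illustrative only).

    state : "DRIVE", "DECELERATE_TO_STOP", or "STAY_STOPPED"
    """
    if state == "DRIVE":
        # A red light ahead triggers a controlled stop.
        return "DECELERATE_TO_STOP" if light == "red" else "DRIVE"
    if state == "DECELERATE_TO_STOP":
        return "STAY_STOPPED"
    if state == "STAY_STOPPED":
        # Resume only on green, after a minimum stop, with a clear intersection.
        if light == "green" and stopped_duration_s >= 1.0 and intersection_clear:
            return "DRIVE"
        return "STAY_STOPPED"
    raise ValueError(f"unknown state: {state}")
```

A real planner tracks many more manoeuvres (lane changes, merges, yields) and attaches safety checks to every transition, but the state-machine skeleton is the same.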
The Local Planner
The local planner must correctly decide whether the vehicle can accelerate, decelerate, or steer safely to carry out the actions given in the behavioural plan.
As an example, imagine you are driving away from your office. The mission planner traces the main route. The behavioural planner understands whether you are in the parking lot or on the street and keeps you stopped until it is safe to rejoin the road. Once the behavioural planner determines that it is safe, the local planner determines the speed at which to enter, whether there is enough time to do so, and how long a left or right turn will take. Once all of these decisions are made, the local planner communicates with vehicle control, which sends the appropriate commands to the vehicle.
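The "enough time to do so" check in that merge scenario can be sketched as a simple feasibility test: does the time needed to reach merge speed, plus the time to complete the manoeuvre, fit inside the gap to the next oncoming vehicle? Everything here (names, the fixed merge duration, the constant-acceleration assumption) is a deliberately simplified illustration:

```python
def can_merge(gap_s, v0, v_target, a_max, merge_time_s=3.0):
    """Decide whether a merge is feasible under a crude kinematic model.

    gap_s        : time gap to the next oncoming vehicle (s)
    v0, v_target : current speed and required merge speed (m/s)
    a_max        : maximum comfortable acceleration (m/s^2)
    merge_time_s : assumed time to complete the lane entry itself (s)
    """
    accel_time = max(0.0, (v_target - v0) / a_max)  # time to reach merge speed
    return accel_time + merge_time_s <= gap_s
```

Real local planners instead optimize full trajectories against dynamic predictions of other road users, but they answer the same yes/no question before committing.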
Control of the Vehicle
Controlling the car is perhaps the most basic job of all, yet if you can’t control the vehicle, it doesn’t matter how well you detect the objects in its surroundings or how accurately its motion is planned: the vehicle will be unable to carry out that plan, regardless of how precise it is.
Vehicle control can be accomplished through two different components, the mathematical model and the algorithm control.
The mathematical model shapes the dynamics of the vehicle and enables the computer to anticipate the vehicle’s behaviour. Because of their safety criticality, dedicated computers evaluate and verify that the steering and brake systems are in proper condition. Usually, these same computers offer the interfaces required to communicate and manipulate lateral and longitudinal speed.
Together, these components should enable the vehicle to manage its own motion sensibly, while also allowing for a teleoperated capability. The car needs both the system model and the control algorithm to execute commands in the manner an autonomous vehicle requires.
Mathematical models and controllers are unique to every type of car, with its specific features such as wheel size, width, length, height, and weight. They provide a mathematical (dynamic/kinematic) model that makes it possible to understand and predict the behaviour of the vehicle in a given state.
For example, braking from 62 mph does not behave the same as braking from 6 mph. Pressing the brakes at 10 percent is not the same as pressing them at 100 percent. Whether the road is wet, slippery, or dry and hot also alters the outcome. A competent mathematical model captures all these constraints.
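The 62 mph versus 6 mph contrast can be made quantitative with the textbook constant-friction stopping-distance formula, d = v² / (2·μ·g). This is a highly idealized model (it ignores load transfer, brake fade, and tyre dynamics, which a full vehicle model would include), but it shows why speed matters so much:

```python
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, friction_coeff):
    """Idealized stopping distance under constant-friction braking.

    d = v^2 / (2 * mu * g). Because distance grows with the square of
    speed, braking from 62 mph takes on the order of 100x the distance
    of braking from 6 mph.
    """
    v = speed_mph * 0.44704  # convert mph to m/s
    return v * v / (2.0 * friction_coeff * G)
```

On dry asphalt (friction coefficient around 0.7), 62 mph needs roughly 56 m to stop, while 6 mph needs about half a metre; a wet road with a lower coefficient stretches both figures.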
The controller is software that takes the model, the desired action, and the current vehicle state, calculates the car’s behaviour, and finds the correct values needed to achieve the intended action, which are then sent to the actuators.
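A common and simple instance of such a controller is a PI (proportional-integral) speed controller: it turns the gap between target and current speed into a throttle or brake command. This sketch is illustrative only, not any manufacturer's actual controller, and the gains and actuator limits are assumed values:

```python
class PIController:
    """Minimal PI speed controller mapping speed error to a drive command.

    Output is clamped to [-1, 1]: +1 = full throttle, -1 = full brake.
    """

    def __init__(self, kp, ki):
        self.kp, self.ki = kp, ki
        self.integral = 0.0  # accumulated error, corrects steady-state offset

    def step(self, target_speed, current_speed, dt):
        error = target_speed - current_speed
        self.integral += error * dt
        command = self.kp * error + self.ki * self.integral
        return max(-1.0, min(1.0, command))  # respect actuator limits
```

Running this in a loop against the vehicle model closes the control circuit: the model predicts how the car responds, and the controller keeps nudging the command until the measured speed matches the plan.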
What Impact will Autonomous Vehicles have on Insurance?
Will we still need insurance if autonomous vehicles make driving safer? Yes, in a nutshell. Insurance will continue to be essential, but it will need to evolve. While technology may help reduce human mistakes behind the wheel, it is not perfect: Tesla already has a track record of deadly vehicle accidents involving its Autopilot feature.
Self-driving vehicles function by assessing road conditions and adjusting driving behaviour using a system of cameras, radar, laser sensors (lidar), and other technologies. Waymo, for example, has cameras that can see in all directions up to three football fields away. Additionally, Waymo vehicles are equipped with software that predicts the movement of anything near the vehicle, including bicycles and pedestrians. If any of those systems fail, a crash may occur. Autonomous cars are unquestionably going to revolutionize the insurance business. Car accidents should decline as we move toward completely autonomous vehicles, and insurance premiums should decrease as a result.
Autonomous vehicles will not take to the roads overnight. The transition to fully autonomous cars is occurring gradually across five levels of automation. Nowadays, most automobiles include either level-1 or level-2 automation: cruise control, electronic stability control, forward collision warning, automatic emergency braking, and self-parking. The Audi A8 was the first mass-market vehicle to achieve level-3 automation.
Even if we are on the verge of a world with level-5 autonomous vehicles, there will be a long period when human-driven and self-driving automobiles coexist on the road, and insurance companies will continue to face significant risks. As self-driving vehicles become more prevalent and lead to reduced rates, insurance firms’ business models will need to adapt.