Driver assistance systems

Bosch Automated Mobility Academy | Course 1

With driver assistance systems, drivers are given the support they need to reach their destinations safely and with less stress.

In this course, the vehicle learns how to perceive and process its surroundings, and how to use these new skills to detect potentially hazardous situations and respond via alerts and/or basic intervention.

A car that has mastered driver assistance is able to:

  • Continuously monitor the entire surroundings of the vehicle millisecond-to-millisecond using sensory perception
  • Instantly detect imminent hazards
  • Immediately respond to imminent hazards via alerts and basic intervention functions

These skills allow the car to significantly enhance safety on the road and mitigate the negative consequences of distracted driving. These features also dramatically enhance driving comfort, enabling the driver to arrive at his or her destination less stressed and in a better mood.

Lesson 1.1

Know your surroundings

With zero automation, the vehicle relies on the driver to see, understand and respond appropriately to lane markings, road signs, and other traffic signals. As the vehicle begins supporting the driver with driver assistance systems, the car itself needs to know these things as well. Bosch gives the car sensors, software and other advanced technologies to make this possible.

Local sensing (on the vehicle)

Driver assistance systems perceive the vehicle's surroundings using sensors such as radar, video, ultrasonic, and MEMS (microelectromechanical systems) sensors, and interpret this information with software. They enhance driving comfort by taking over monotonous and tedious driving tasks from drivers. Additionally, these systems increase ride comfort and road safety by supporting drivers in complex or critical situations that demand quick, safe action.


Sensor fusion for a clearer view

Today's and future driver assistance systems combine information from different sensors in the vehicle. By intelligently fusing the incoming sensor data, the strengths of the various sensor principles are used to optimum effect. This sensor fusion yields more detailed and reliable information about the vehicle's surroundings than would be possible with individual sensors.
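The idea behind sensor fusion can be sketched with a toy example (the values and the weighting scheme here are illustrative assumptions, not Bosch's actual algorithm): two noisy distance estimates, say one from radar and one from a camera, are combined by inverse-variance weighting, and the fused result is more certain than either sensor alone.

```python
# Toy sensor-fusion sketch: fuse two noisy distance estimates by
# inverse-variance weighting. The less noisy sensor gets more weight,
# and the fused variance is smaller than either input variance.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Return the fused estimate and its (reduced) variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Radar reports 20.0 m (variance 0.25), camera reports 20.6 m (variance 1.0):
d, var = fuse(20.0, 0.25, 20.6, 1.0)
# fused estimate ≈ 20.12 m with variance 0.2 — better than either sensor alone
```

The same principle, in far more elaborate form, underlies combining radar, video, ultrasonic, and lidar data into a single environment model.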

Multi-camera system

Video cameras can have one or two lenses (for 2D and 3D imaging respectively), and can be combined and configured in numerous ways to achieve near- and long-range vision all around the vehicle in virtually any lighting conditions.

Long- and mid-range radar

Radar sensors utilize radio waves to determine the position, speed, and size of objects within a certain range. Bosch offers tailored solutions, including both long- and mid-range radar, designed to enable standard use of radar sensors across all vehicle segments.
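As a rough illustration of the radar principle (the carrier frequency and Doppler shift below are assumed example values, not the parameters of a real Bosch sensor), an object's radial speed follows directly from the Doppler shift of the reflected radio wave:

```python
# Illustrative sketch: deriving an object's relative speed from the
# Doppler shift of a radar echo, v = f_d * c / (2 * f_carrier).
C = 3.0e8          # speed of light, m/s
F_CARRIER = 77e9   # typical automotive radar carrier frequency, Hz

def relative_speed(doppler_shift_hz: float) -> float:
    """Radial speed of the reflecting object in m/s.

    A positive Doppler shift means the object is approaching.
    """
    return doppler_shift_hz * C / (2 * F_CARRIER)

# An approaching vehicle producing a 5 kHz Doppler shift:
v = relative_speed(5_000.0)  # ≈ 9.74 m/s (about 35 km/h)
```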

Ultrasonic sensors

Ultrasonic sensors detect near-range objects, and work on the pulse-echo principle; the sensors transmit short ultrasonic impulses which are reflected by obstacles.
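The pulse-echo principle reduces to simple arithmetic: the distance to the obstacle is half the pulse's round-trip travel time multiplied by the speed of sound. A minimal sketch (example values assumed):

```python
# Pulse-echo sketch: distance follows from the time between emitting
# an ultrasonic pulse and receiving its reflection off an obstacle.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_distance(round_trip_time_s: float) -> float:
    """Distance to the obstacle in metres (the pulse travels there and back)."""
    return SPEED_OF_SOUND * round_trip_time_s / 2

d = echo_distance(0.01)  # a 10 ms round trip → ≈ 1.7 m to the obstacle
```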


Lidar

Lidar uses a light sensor (laser) to create a 3D digital map of the area surrounding the vehicle.


MEMS sensors: environmental sensing for smart mobility

In order to be able to always make the right decisions in real time, vehicle control units receive information gathered from numerous MEMS sensors. These hidden heroes demonstrate their superpowers when it comes to detecting even the tiniest changes in their environment. Their precise measurements ensure that the system is always up to date on what is happening both inside and outside the vehicle.


Sensing beyond the line of sight

While on the road, nothing is more important than knowing what’s coming next.

To make automated driving a reality, it's not just about the communication happening in the car between various systems and technologies. The vehicle must be able to connect and communicate with the world beyond.

Through cellular connections, the connectivity control unit (CCU) manages the vehicle's connections with the external world and the cloud. The CCU enables services such as emergency call (eCall), concierge service, and stolen vehicle tracking. It can also serve as a hotspot, giving mobility users WiFi access while in the vehicle. Mobility users expect a seamless transition between home, car, and work; a CCU wirelessly connects these spaces, and the automated vehicle becomes a third living space.

Enabling the car to see beyond the line of sight is imperative to maximize the efficiency and safety of an automated vehicle. A vehicle-to-everything (V2X) module allows vehicles to communicate with the world around them, even beyond the line of sight. Using dedicated short-range communication (DSRC) or cellular V2X, vehicles can communicate with other vehicles, infrastructure, and pedestrians. Combining V2X data with local sensor information through sensor data fusion gives the vehicle a holistic view of its surroundings.

Connected horizon helps drivers identify hazards before they come into view. Connectivity with the cloud provides highly detailed topographical data, plus real-time information on road and traffic conditions. As soon as something changes, the driver knows about it.

Connected functions, and especially highly automated driving, call for ongoing updates throughout the vehicle's entire life cycle. Vehicle software patches and updates can be carried out via the cloud, without ever having to visit a workshop. The communication control units and central gateway computer provide the transmission and encryption technologies that make over-the-air updates possible, and help ensure the vehicle's functional safety and data security against unauthorized access.

Lesson 1.2

Human vs. machine sensory perception

Machine perception refers to the ability of a machine (in this case, the car) to see and understand its surroundings, similar to how a person uses their senses (eyes, ears, touch, etc.) to relate to their environment.

To provide the most value for drivers, we must understand how they behave on the road, and how the car's specially engineered components and systems allow it to enhance (and often dramatically surpass) human capability.

To put it simply, cars have superhuman senses; they see further and react faster than humans can.

In Course 4, we'll discover how artificial intelligence allows the vehicle to interpret and make decisions based on this sensory information, but for now, let's take a look at some of the basic differences between human and vehicle perception.



Vision and field of view

Even if a driver's car mirrors are adjusted to avoid blind spots, our standard front-facing, single-focal-point vision makes it impossible to simultaneously view the road ahead, the side mirrors, and the rear-view mirror.


A vehicle equipped with 360-degree sensing technology can "see" in all directions around the car, and utilizes multiple types of cameras and sensors to verify and understand the surroundings, even in the dark or in hazardous conditions, such as fog or snow.

Attention and multi-tasking


Distracted driving is a leading cause of road accidents. According to an American Psychological Association study, distractions and multitasking while driving take "attention away from the ability to process information about the driving environment... and limit working memory," both of which are critical for safe driving.


Vehicles can give 100% attention in all directions and to all tasks without ever getting tired or distracted.

Response times and effectiveness


According to a UMTRI publication, it takes an average person more than 1.5 seconds to see, understand, and respond to an unexpected hazard. Even if the human response is timely, the braking force or evasive maneuver applied by the driver is in many situations not sufficient to prevent a collision.


Vehicles equipped with automated technology can almost instantly respond to imminent hazards and take appropriate action to avoid or minimize the impact of an accident.
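To put that 1.5-second figure in perspective, here is a back-of-the-envelope sketch (the travel speed below is an assumed example) of how far a car moves before the driver even touches the brakes:

```python
# How far does a car travel during the driver's reaction time alone?
# Uses the ~1.5 s average reaction time quoted above; the speed is an
# assumed example value.
REACTION_TIME_S = 1.5

def reaction_distance(speed_kmh: float) -> float:
    """Metres travelled before any braking begins."""
    return speed_kmh / 3.6 * REACTION_TIME_S  # km/h → m/s, then × time

d = reaction_distance(100.0)  # at 100 km/h ≈ 41.7 m before braking starts
```

A system that reacts in tens of milliseconds shrinks that pre-braking distance by well over an order of magnitude.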

Application 1.1

Avoiding collisions

Driver assistance systems for safety applications include alert systems and basic intervention functions.

Alert systems

If the vehicle detects a potential hazard or obstacle, alert systems can notify the driver by means of visual, audible and/or haptic signals.


Basic intervention systems

If a vehicle detects a hazard or obstacle and the driver fails to take appropriate action, the vehicle can intervene automatically to prevent or mitigate the consequences of a collision.


Application 1.2

Comfort and convenience

Although comfort and convenience functions can certainly make life on the road safer, these functions are primarily intended to reduce the driver's stress, for instance, by taking on basic speed and steering tasks to keep the car on track.


Congratulations, you're through Course 1!