Video telematics with DMS: A complete guide

Posted by: LightMetrics
Category: AI, Video Telematics

ADAS (Advanced Driver Assistance Systems), electronic systems that aid the driver and enhance safety, were the starting point of a technology-led evolution that has put us on the ramp towards full automation (SAE Levels 1-5). From humble beginnings with passive driver alerts to the more sophisticated self-driving systems being tested on roads today, the primary focus has been on sensing and reacting to the external environment. A look at some of the most common safety features available in mass-production vehicles today validates that assessment:

  • Pedestrian detection/avoidance
  • Lane departure warning/correction
  • Traffic sign recognition
  • Automatic emergency braking
  • Blindspot detection

A DMS (Driver Monitoring System), on the other hand, monitors the inside of a vehicle’s cabin, more specifically the driver, primarily through inward-facing cameras. AI algorithms monitor a driver’s attentiveness, fatigue, drowsiness, and more in real time. As is clear from the SAE J3016 standard, up to Level 3 (and possibly even Level 4) the driver is required to be attentive and ready to take control at a moment’s notice. Human reaction times being what they are, it becomes imperative to monitor the driver at all times, to make sure they are never lulled into a sense of complacency with fatal consequences.

Even aside from the now prevalent view that DMS will be a key component of the journey towards full automation, enough pressing use-cases exist today, across both commercial and passenger vehicles, to make DMS a critical component of any OEM or aftermarket safety system. We focus on video telematics, the fastest-growing segment of fleet telematics, and the critical role camera-based DMS will play across three main use-cases: driver identification, in-cab alerts, and effective driver coaching. We also look at the different types of aftermarket hardware available for DMS, and their pros and cons.

Driver identification

Associating a driver with every trip is of fundamental importance to a fleet – whether from a compliance perspective (HoS, or Hours of Service) or for operational analytics like driver safety. This has typically been solved using a variety of modalities, including authentication through mobile applications, NFC cards, post-hoc assignments by the fleet manager, and more. All of them require manual intervention and are therefore susceptible to human error to varying degrees:

  • Trips are left unassigned when a driver forgets their keycard.
  • Trips are wrongly assigned when drivers change and the outgoing driver forgets to log out.
  • Fleet managers make errors when reconciling multiple unassigned trips with different drivers (a laborious, painstaking, and slow process).

A driver-facing camera that is able to reliably identify the driver using face recognition can solve many of the problems seen with manual authentication schemes. The workflow can be broken down into two parts:

  • Driver enrollment: This refers to a process wherein images of a driver’s face are enrolled into a system (with their consent), to help the system ‘learn’ what the driver looks like. A representation of the driver’s facial features is generated and stored securely in a database associated with the fleet. This is done across the fleet for all active drivers and can be done by drivers themselves using mobile apps, or by fleet managers with central access to a driver management portal. A hybrid system is also possible, where enrollment happens automatically once the driver has signed on via another modality like an NFC card, and over time the entire fleet is covered.
  • Driver recognition: This refers to the process of automatically assigning a driver to a trip by matching the representation of the driver’s face against the library of enrolled drivers for that particular fleet. If a match of sufficiently high confidence is found, a driver is assigned to the particular trip. Some regulations like the ELD mandate do not allow for automatic assignment of drivers, so a suggestion to the fleet manager using a separate identifier is often more appropriate.
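The recognition step described above can be sketched as a nearest-neighbor match over enrolled face embeddings. This is a minimal illustration, not LightMetrics’ implementation: the embedding function, the similarity threshold, and all names here are assumptions for the sketch.

```python
import math

# Hypothetical cosine-similarity cutoff; a production system would tune
# this to balance false matches against unassigned trips.
MATCH_THRESHOLD = 0.6

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def identify_driver(trip_embedding, enrolled):
    """Return the best-matching enrolled driver id, or None.

    `enrolled` maps driver_id -> face embedding stored at enrollment.
    A None result means the trip should be surfaced to the fleet manager
    as unassigned (or as a suggestion, where mandates like the ELD rule
    forbid fully automatic assignment).
    """
    best_id, best_score = None, -1.0
    for driver_id, emb in enrolled.items():
        score = cosine_similarity(trip_embedding, emb)
        if score > best_score:
            best_id, best_score = driver_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None
```

In practice the embeddings would come from a trained face-recognition model, and the threshold would be chosen per fleet to meet a target confidence level.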

In-cab alerts

As with ADAS, this is the original use-case for any assistance system – provide feedback to the driver in real-time to correct potentially dangerous behavior. The most common driver events detected by DMS are:

  • Driver distraction: This refers to a broad category of events that take a driver’s attention away from the road – looking at a cellphone, talking or texting on it, talking to co-passengers, eating, smoking, etc. By alerting the driver in real-time, potential critical incidents can be averted. Distraction detection algorithms need to be robust across variations like ethnicity, gender, age, etc., and also different vehicles with different relative distances between the camera and driver. This is usually handled through a self-calibration process based on the driver’s face and default driving position relative to the camera. Other parameters like vehicle speed, turn signals, etc. are also critical to fine-tuning driver distraction algorithms to ignore legitimate cases where drivers need to look away from the road for short amounts of time.
  • Driver drowsiness: This refers to events brought about by driver fatigue causing episodes of microsleep. High-resolution DMS cameras, mounted appropriately, use landmarks around the face and the eyes to detect patterns indicative of microsleep and warn the driver. These algorithms usually have limitations around the use of sunglasses that block IR light (DMS cameras need IR illumination to illuminate the cabin in the absence of ambient light).
  • Seatbelt warning: Detecting seatbelt compliance through signals from the vehicle bus is notoriously challenging across a variety of vehicle classes, and also suffers from the technical limitation that it only detects if the seatbelt is locked, not that the driver is wearing it. AI algorithms can reliably detect the seatbelt signature across the driver’s torso, and provide timed alerts until the driver is compliant.
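The suppression logic described for distraction alerts can be sketched as a simple gating function. All thresholds and names below are illustrative assumptions; real systems derive them per camera mounting and per driver via the self-calibration process mentioned above.

```python
# Illustrative thresholds -- not production values.
MAX_OFF_ROAD_ANGLE_DEG = 25.0   # head yaw beyond this counts as "off road"
MIN_DISTRACTION_SECONDS = 2.0   # sustained look-away before alerting
MIN_SPEED_KMH = 10.0            # ignore events while (nearly) stationary

def should_alert(head_yaw_deg, off_road_seconds, speed_kmh, turn_signal_on):
    """Return True if a real-time distraction alert should fire.

    Glances during signalled turns (e.g. mirror checks) and low-speed
    maneuvers are treated as legitimate look-aways and suppressed.
    """
    if turn_signal_on or speed_kmh < MIN_SPEED_KMH:
        return False
    return (abs(head_yaw_deg) > MAX_OFF_ROAD_ANGLE_DEG
            and off_road_seconds >= MIN_DISTRACTION_SECONDS)
```

The same gating pattern extends naturally to other vehicle-bus signals (gear position, braking) as additional suppression conditions.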

When combined with ADAS, data from either system can be used to provide more nuance to alerts generated by the other, e.g. when a DMS system detects a driver looking away from the road, ADAS thresholds warning against forward-collision or lane drift can be reconfigured automatically.
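One way to picture this ADAS/DMS fusion is a threshold that adapts to driver state. The values below are made-up placeholders purely to illustrate the idea of warning earlier when the driver is looking away.

```python
# Illustrative time-to-collision (TTC) thresholds, in seconds.
BASE_FCW_TTC_S = 2.0        # baseline forward-collision warning threshold
DISTRACTED_FCW_TTC_S = 3.0  # warn earlier when DMS reports distraction

def fcw_threshold(driver_distracted):
    """Return the TTC below which a forward-collision warning fires.

    When the DMS flags the driver as distracted, the threshold is raised
    so the warning triggers sooner, compensating for slower reaction.
    """
    return DISTRACTED_FCW_TTC_S if driver_distracted else BASE_FCW_TTC_S
```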

In the context of fleet management, DMS alerts are also frequently combined into meta-events to alert the fleet manager. In situations where a driver is generating multiple distraction alerts in a short span of time, automatic notifications can be configured for the fleet manager to manually intervene and avert a potential incident.
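A meta-event of this kind can be sketched as a sliding-window counter over individual alerts. The class name and thresholds below are hypothetical, chosen only to illustrate the aggregation.

```python
from collections import deque

class MetaEventMonitor:
    """Flag a fleet-manager notification when too many distraction
    alerts occur within a sliding time window (thresholds illustrative)."""

    def __init__(self, max_alerts=3, window_seconds=300):
        self.max_alerts = max_alerts
        self.window_seconds = window_seconds
        self.timestamps = deque()

    def record_alert(self, t):
        """Record an alert at time t (seconds since trip start).

        Returns True when the threshold is crossed, i.e. the fleet
        manager should be notified to intervene.
        """
        self.timestamps.append(t)
        # Drop alerts that have aged out of the window.
        while self.timestamps and t - self.timestamps[0] > self.window_seconds:
            self.timestamps.popleft()
        return len(self.timestamps) >= self.max_alerts
```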

Driver coaching

While real-time alerts are critical for accident avoidance, patterns of behavior emerge over time that can only be corrected through driving-behavior analytics, along with manual review of incident videos. The most common enablers of post-hoc driver coaching using DMS are:

  • Severity-based highlights of critical events of driver distraction or drowsiness, helping the fleet managers focus on the most critical incidents easily. A severity for DMS events is usually a combination of intrinsic metrics like angle and duration of head pose and gaze, and external parameters like location, velocity, and inputs from other systems like ADAS. Fleet managers can tag or bookmark events for driver review, which a driver can complete as part of the pre-trip checklist before her next trip, usually through a driver mobile app.
  • Analytics that correlate driver drowsiness with duration of travel, which can help fleet managers coach drivers to take necessary breaks on long-haul trips. Similarly, distraction events can be correlated with time of day, traffic, weather, and other external factors, creating targeted programs for driver improvement.
  • Automatic sub-classification of distracted driving events into categories like smoking, eating, or drinking to provide more actionable inputs to drivers.
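The severity combination described in the first bullet can be sketched as a weighted score over intrinsic and external terms. The weights and normalizing constants here are invented for illustration; a real system would calibrate them against reviewed incidents.

```python
def event_severity(head_yaw_deg, duration_s, speed_kmh, adas_warning):
    """Combine intrinsic metrics (pose angle, look-away duration) with
    external context (speed, concurrent ADAS warning) into a severity
    score in [0, 1]. All weights and normalizers are illustrative.
    """
    angle_term = min(abs(head_yaw_deg) / 90.0, 1.0)   # normalize to [0, 1]
    duration_term = min(duration_s / 5.0, 1.0)
    speed_term = min(speed_kmh / 100.0, 1.0)
    score = 0.3 * angle_term + 0.3 * duration_term + 0.3 * speed_term
    if adas_warning:  # e.g. a forward-collision warning fired at the same time
        score += 0.1
    return min(score, 1.0)
```

Fleet dashboards would then sort or filter events by this score so managers see the riskiest incidents first.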

Aftermarket hardware

Driver-facing cameras for DMS come in two main variants:

  • Unibody dash cams: This is the most common category of aftermarket dash cams and consists of a road and driver-facing camera integrated into the same unit. While this leads to lower installation complexity, care needs to be taken to ensure that both road and driver-facing views are optimal for both ADAS and DMS. Additionally, cameras should ideally provide independently rotating swivels for both road and driver-facing lenses, to give maximum flexibility in ensuring that appropriate areas of the road and cabin are adequately covered by the single unit. It should be noted that for unibody dash cams, most safety regulations prevent mounting the camera directly in front of the driver; the driver-facing view is therefore at an angle rather than head-on, and optimal mounting positions are typically off-center, biased towards the driver. Aftermarket DMS solutions have to calibrate for the driver’s position and account for that in the different algorithms that consider head position and eye gaze with respect to the vehicle’s line of motion.
  • Dedicated DMS cameras: Cameras of this type are mostly popular in OEM DMS systems where a dedicated camera is factory-fitted for optimal coverage of the driver’s head and eyes. In the aftermarket, it is more common to see DMS cameras as a separate accessory off a main road-facing camera or MDVR unit, with the flexibility to mount them independently. The additional installation complexity due to extra wiring and separate mounting instructions is offset by the optimal driver view that can be ensured by a dedicated driver-facing unit.

IR LEDs providing illumination in the absence of ambient light are a must for all driver-facing cameras and are designed to provide a sharp view of the driver’s facial features at all times. DMS algorithms consequently must be trained to work equally well on videos captured in normal lighting and under IR illumination, which can differ quite significantly in appearance.

The rapid penetration of video telematics, coupled with increased awareness around the safety issues surrounding driver distraction, has created significant tailwinds for the adoption of DMS by commercial fleets globally. Aftermarket installs also have the salutary effect of improving DMS algorithms through exposure to a large corpus of real-world data across driver ethnicity, age, gender, facial hair, eyewear, and more, and also different environments and lighting conditions. Seen in this light, aftermarket installs are a critical step towards the creation of robust DMS-based solutions by OEMs, and might even be precursors to many innovations that end up later in assembly-line installed systems.

As DMS-based safety systems now firmly enter the mainstream, it’s time for telematics service providers and fleets to actively embrace and integrate them with existing solutions and workflows. Write to us to find out more about the DMS features available in RideView.
