Video live streaming: Putting you in the driver’s seat

Posted by: LightMetrics
Category: Uncategorized

The value fleets derive from video telematics takes many forms. It is led predominantly by the need for clear video evidence when crashes or other incidents occur, and increasingly, with the advent of on-device AI, by the ability to track, measure, and improve driver performance through real-time and offline coaching. Together, these form the core value proposition of advanced video telematics and are driving its rapid adoption by fleets globally.

As video telematics makes the leap into emerging markets, and into certain segments of developed markets, the driver-centric view is augmented by an asset-centric one – giving fleet managers the assurance that they can see what their assets are doing and where they are, in real time. From a vehicle-security standpoint, video live streaming becomes critical to the overall success of video telematics in such markets. Consider a delivery fleet manager who wants to know why one of her vehicles has been idling too long at one location: instead of calling the driver, she can start a video stream with one click and see for herself. In markets where vehicle theft is an issue, the benefit of being virtually present in the cabin in real time is apparent.

As our partner base has grown – spanning Australia, South Africa, LATAM, and North America – one of the most common feature requests we heard across the board was for the RideView platform to support video live streaming. As with any enhancement that benefits a wide section of our customer base, we listened to partners and end-users, thought about the best experience we could provide, and are happy to announce today the availability of video live streaming in RideView. Before we got here, though, we had a few other things to build first.

Live GPS tracking:

A precursor to a real-time virtual view of the road and cabin is knowing where the vehicle is at any given moment – otherwise known as live GPS tracking. Before we even started working on live video streaming, we had to build out the infrastructure and the front-end experience that lets end-users track their assets in real time. This also matters for our partners globally, who can now use a dash cam as a GPS tracking device – hardware convergence that lowers overall solution costs for end-users.

The broad goals we set out to achieve were:

  • Provide real-time updates from assets in the field
  • Enable fleet managers to track all active assets in one place
  • Ensure a continuous flow and interpretation of events
  • Build the front-end framework for live video streaming

We use Apache Kafka, one of the most popular distributed event streaming platforms available, as the underlying framework for integrating live updates into RideView. It captures data from dash cams in real time as streams of events, and lets us store, process, and react to them as they arrive.
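To illustrate the event-streaming idea, here is a minimal sketch of the kind of per-asset state folding such a pipeline performs. In production the records would arrive via a Kafka consumer subscribed to a topic; here they are plain dicts so the logic stands alone. All field names, device IDs, and event shapes are hypothetical, not RideView's actual schema.

```python
def process_event(event, asset_state):
    """Fold one dash-cam event into the per-asset live state.

    In production, `event` would be a record consumed from a Kafka topic;
    the field names here are illustrative assumptions.
    """
    asset = asset_state.setdefault(event["device_id"], {})
    if event["type"] == "gps":
        asset["location"] = (event["lat"], event["lon"])
        asset["last_seen"] = event["ts"]
    elif event["type"] == "ignition":
        asset["ignition_on"] = event["on"]
    return asset_state

# Simulated stream of two events from one dash cam.
state = {}
events = [
    {"device_id": "cam-42", "type": "ignition", "on": True},
    {"device_id": "cam-42", "type": "gps", "lat": 12.97, "lon": 77.59, "ts": 1700000000},
]
for e in events:
    process_event(e, state)

print(state["cam-42"]["location"])  # (12.97, 77.59)
```

The same handler can feed both the live map and downstream event interpretation, since each consumer reads the stream independently.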

As a sample implementation of this workflow, the RideView fleet portal has a new LIVE-VIEW tab that surfaces all real-time data streamed from dash cams. It displays a list of all registered fleet assets along with their status – Green denoting live assets that are on a trip, and Red denoting assets that are immobile (ignition off). The last known location of every asset is shown, and for live assets the location is updated every 30 seconds. Clicking an asset focuses the view on it and tracks its motion in real time.
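The Green/Red marker logic above can be sketched as a small status function. The staleness rule – treating an asset as Red if it misses a few 30-second update intervals – is an assumption for illustration, not RideView's actual logic.

```python
STALE_AFTER_S = 90  # assumed: three missed 30-second update intervals

def asset_status(ignition_on, last_update_ts, now_ts):
    """Map an asset's live state to its LIVE-VIEW marker colour.

    Green = on a trip and reporting; Red = ignition off, or (an assumed
    rule) the asset has gone silent past the staleness threshold.
    """
    if not ignition_on:
        return "red"
    if now_ts - last_update_ts > STALE_AFTER_S:
        return "red"
    return "green"

print(asset_status(True, 1000, 1020))   # green
print(asset_status(False, 1000, 1020))  # red
print(asset_status(True, 1000, 1200))   # red (stale)
```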

Fig. 1 – Live GPS tracking for all fleet assets

Live Video Streaming:

Knowing the real-time location of an asset, while critical, is only half the equation. A dual dash cam lets a fleet manager remotely stream road- and driver-facing views and be an eyewitness to whatever is happening at any given moment. For every asset that is on a trip (Green on the map above), a fleet manager can request video in a variety of configurations:

  • Resolution – 180p, 360p, and 720p
  • Format – road, driver, side-by-side, picture-in-picture

Fig. 2 – Live video streaming

Fig. 3 – Video streaming configurations

Driving our live streaming APIs is the AWS Kinesis Video Streams service. The server-to-browser protocol we use is HLS (HTTP Live Streaming), chosen primarily for the following reasons:

  • The stream is downloaded as small files over HTTP, so it can reuse existing HTTP infrastructure
  • It is designed for reliability, dynamically adapting playback to the available speed of wired and wireless connections
  • It is widely supported across platforms
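The first point – HLS delivering video as small files over HTTP – is visible in the media playlist the player polls. A minimal sketch, parsing a hypothetical playlist (segment names and durations are illustrative only, not what Kinesis Video Streams actually serves):

```python
# A minimal HLS media playlist; contents are illustrative.
PLAYLIST = """\
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXTINF:3.0,
seg-001.ts
#EXTINF:3.0,
seg-002.ts
#EXTINF:2.5,
seg-003.ts
"""

def parse_playlist(text):
    """Return (segment_uri, duration_s) pairs from an HLS media playlist."""
    segments, duration = [], None
    for line in text.splitlines():
        if line.startswith("#EXTINF:"):
            duration = float(line[len("#EXTINF:"):].rstrip(","))
        elif line and not line.startswith("#"):
            segments.append((line, duration))
    return segments

segs = parse_playlist(PLAYLIST)
print(len(segs), sum(d for _, d in segs))  # 3 8.5
```

Because each segment is an ordinary HTTP download, the player can switch to a lower-bitrate variant playlist whenever the connection degrades – which is exactly the adaptive behaviour the second point describes.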

Given the data-heavy nature of video streaming, setting the minimum acceptable resolution and format (e.g. side-by-side video is 2x the size of either the road or driver view) is necessary to extract the most value from this feature while staying within prescribed data limits. Caps on monthly streaming duration are another way to ensure fleets avoid expensive data overages. Future enhancements to the live streaming module include driver-initiated streaming (to draw a fleet manager's attention to something in real time) and audio capture for added context.
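To make the data-budget trade-off concrete, here is a back-of-the-envelope sketch of how many minutes of streaming a monthly cap buys at each resolution. The bitrates are illustrative assumptions, not RideView's actual encoder settings; only the 2x factor for dual-view formats comes from the paragraph above.

```python
# Illustrative bitrates per resolution (assumptions, not actual settings).
BITRATE_KBPS = {"180p": 250, "360p": 600, "720p": 1500}

def streaming_minutes(cap_mb, resolution, dual_view=False):
    """Minutes of live streaming a monthly data cap allows.

    Side-by-side and picture-in-picture combine both cameras,
    roughly doubling the data rate (the 2x factor noted above).
    """
    kbps = BITRATE_KBPS[resolution] * (2 if dual_view else 1)
    mb_per_min = kbps * 60 / 8 / 1000  # kilobits/s -> megabytes/min
    return cap_mb / mb_per_min

print(round(streaming_minutes(1000, "360p")))                  # 222
print(round(streaming_minutes(1000, "360p", dual_view=True)))  # 111
```

Under these assumed bitrates, a 1 GB cap supports roughly 3.7 hours of single-view 360p streaming per month, halving for side-by-side – which is why configurable minimums and duration caps matter.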

Part of the reason the RideView platform is so versatile, and meets the needs of a diverse end-user base, is that we incorporate feedback from the best companies around the world. It is this feedback loop that has allowed our platform to keep evolving and remain at the forefront of advanced video telematics. If you want to know more about live streaming or the other features available in RideView, drop us a note at

