An update on our work on low latency streaming (2024)

It’s been a while since I last blogged about our work on reducing the delays in live streaming. Since our early demonstrations of low latency streaming four years ago, a lot has been done here in BBC R&D and in the wider industry.

As a reminder, if you’ve ever streamed a live programme over the internet while someone else nearby is watching the same thing on traditional broadcast television, you’ll know that the streamed version is usually a bit behind. There are lots of reasons for this. In some cases, choices we make to give us flexibility add some delay. Other delays help the media flow reliably through our complex systems and through the Content Delivery Networks (CDNs) that we use to allow large audiences to watch simultaneously. Delay also helps prevent the pictures “stalling” when the throughput of your internet connection changes or other devices in your home start downloading something. The latency also varies depending on the model of TV you’re using.

When I last wrote about this, I described how we had been able to reduce this latency in online streams to the point that it matched, or was potentially even lower than, traditional TV broadcast. However, putting that method into practice isn’t just a matter of flicking a switch – it was always going to be a challenge, needing change and collaboration across the media industry. Here, I want to give you an update on what we’ve been up to.

Image above from korea.net on Flickr, photographer Heo Manjin, CC licence.

Part of the challenge in reducing latency to match that of traditional broadcast is that changes are needed throughout the distribution chain, from streamlining the process of getting video and audio to our encoders to reducing the amount of buffering used by the streaming client in your TV. Along the way, there are changes needed to allow the media to flow more progressively through the content delivery network. All of this depends on agreeing industry standards so that there is a common approach that everyone can support, and tests and trials are needed too, to ensure that it works across the different kinds of TV and other streaming devices that our audience uses.

So how are we doing in these areas?

Firstly, on standards: BBC R&D was one of the main contributors to industry work in DVB to update the DVB DASH standard to support low latency streaming. A collaboration between DVB and the DASH Industry Forum led to a consistent low latency streaming approach for DASH that can be used throughout the world. Following this, BBC R&D also contributed to a new version of the HbbTV specification that underpins Freeview Play TVs in the UK. HbbTV 2.0.3 now includes support for low latency streaming using a flexible approach whereby many of the complexities of achieving reliable streaming are delegated to apps using the W3C Media Source Extensions (MSE) API, which is now standard in browsers. TVs including this new functionality are now becoming more widespread in our viewers’ homes.
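To give a flavour of what that delegation looks like, here is a minimal sketch of an app driving playback through MSE. It assumes a browser-style environment; the codec string and segment URL are illustrative placeholders rather than anything from our actual streams.

```typescript
// Minimal sketch: with MSE, the app, not the browser, owns fetching and buffering.
const video = document.querySelector('video') as HTMLVideoElement;
const mimeType = 'video/mp4; codecs="avc1.640028"'; // illustrative codec string

const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer(mimeType);

  // Fetch and append an initialisation segment; media segments follow the same
  // pattern. Because the app controls every append, it can choose to keep the
  // playback buffer deliberately short.
  const init = await fetch('/stream/init.mp4').then(r => r.arrayBuffer()); // placeholder URL
  sourceBuffer.appendBuffer(init);
});
```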

DASH is not the only standard in use for streaming. HLS is also widely used. There is enough in common between the two that the same media segments can be used in some cases. Through our involvement in the CTA WAVE standards work, we contributed to the publication of a DASH-HLS interoperability specification that also covers low latency.

We’ve produced test content for low latency that complies with these standards that we can use for compatibility testing, simulations and trials. More on those aspects shortly.

Over the past few years, we’ve seen commercial CDNs roll out functionality that allows for progressive delivery of media segments. This is an important enabler for achieving low latency streaming with DASH as it allows ‘segments’ of video and audio to progress through the CDN in smaller ‘chunks’ as they come out of the encoders, rather than being held up until a complete segment of around 4 seconds’ duration is available. Here in R&D we conducted a series of detailed tests to characterise this new functionality across three CDNs to identify any significant limitations and assess its suitability.
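As a rough illustration of what this means on the receiving end, the sketch below reads a segment from the network as a stream and appends each chunk for playback as it arrives, rather than waiting for the whole segment. It assumes the MSE setup sketched earlier; the function name and URL are hypothetical.

```typescript
// Sketch of progressive ('chunked') segment handling on the client,
// assuming the CDN forwards chunks as the encoder produces them.
async function appendSegmentProgressively(url: string, sourceBuffer: SourceBuffer): Promise<void> {
  const response = await fetch(url);
  const reader = response.body!.getReader();

  // Each read() resolves as soon as the next chunk arrives, so media can be
  // buffered and played well before the full ~4 second segment exists.
  while (true) {
    const { done, value } = await reader.read();
    if (done || !value) break;
    await whenIdle(sourceBuffer);
    sourceBuffer.appendBuffer(value);
  }
}

// SourceBuffer rejects appends while a previous append is still in progress.
function whenIdle(sb: SourceBuffer): Promise<void> {
  return sb.updating
    ? new Promise(resolve => sb.addEventListener('updateend', () => resolve(), { once: true }))
    : Promise.resolve();
}
```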

One of the biggest challenges to achieving low latency lies in the streaming client – the software that runs on your TV or your mobile phone that tries to avoid playback stalling by choosing different ‘representations’ or ‘renditions’ of the video according to the throughput that your internet connection can achieve. Reducing latency to something comparable with broadcast means reducing the amount of time the client has to react to changes in throughput and that can mean poorer reliability with more stalling. To avoid this we need to eliminate any client behaviours that add delay and optimise algorithms to make the best decisions possible.
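The core of that decision making is a rate adaptation (often called ABR) algorithm. The sketch below shows the simplest throughput-based rule, purely for illustration; it is not the algorithm we use, and real low latency clients have to be considerably more careful because of the small buffer they keep.

```typescript
// Illustrative throughput-based switching rule: pick the highest-bitrate
// rendition that fits within a safety margin of the measured throughput.
interface Rendition {
  id: string;
  bitrateKbps: number;
}

function chooseRendition(
  renditions: Rendition[],   // available representations of the same content
  throughputKbps: number,    // recently measured download throughput
  safetyFactor = 0.8         // headroom so small dips in throughput don't cause stalls
): Rendition {
  const byBitrate = [...renditions].sort((a, b) => a.bitrateKbps - b.bitrateKbps);
  let choice = byBitrate[0]; // always fall back to the lowest bitrate
  for (const r of byBitrate) {
    if (r.bitrateKbps <= throughputKbps * safetyFactor) {
      choice = r;
    }
  }
  return choice;
}
```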

BBC R&D’s main focus in this area has been in making open source contributions to the dash.js JavaScript DASH player to improve low latency support as well as testing and assessing different stream switching algorithms through simulations and trials.
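For readers who want to experiment, the sketch below shows roughly how a dash.js player can be configured for low latency playback. The exact setting names vary between dash.js versions (this follows the v4-style settings object), and the manifest URL and delay values are placeholders, so check them against the documentation for the version you are using.

```typescript
import * as dashjs from 'dashjs';

const video = document.querySelector('video') as HTMLVideoElement;
const player = dashjs.MediaPlayer().create();

// Illustrative low latency settings; key names differ between dash.js versions.
player.updateSettings({
  streaming: {
    delay: { liveDelay: 3 },          // target live delay in seconds (example value)
    liveCatchup: { enabled: true }    // gently adjust playback speed to hold that delay
  }
});

// Placeholder manifest URL; the final argument enables autoplay.
player.initialize(video, 'https://example.com/live/manifest.mpd', true);
```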

That brings me to a key question: if we are testing a potential improvement, how can we tell if it actually makes things better? In other words, how do the things that we can directly measure (like stalling, average picture quality, start-up delay and so on) actually relate to the quality that our viewers feel that they are getting when watching a live stream – the so-called Quality of Experience or QoE?

We have used two QoE models to do this. One is a standard called ITU-T Rec. P.1203. It has some limitations in terms of the range of video codecs and resolutions it can handle. The other has been developed here at BBC R&D and is more broadly applicable (as well as being simpler to calculate). We have applied both of these measures in extensive simulations of different client algorithms using our own test streams and are currently writing up the results of this comprehensive study for publication.
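To make the idea of a QoE model concrete, here is a deliberately simplified scoring function. It is not ITU-T P.1203 and not the BBC R&D model; it just combines the measurable factors mentioned above (average picture quality, stalling, start-up delay) with made-up weights to show the shape of the calculation.

```typescript
// Illustrative only: combines session statistics into a single 1-5 score.
interface SessionStats {
  meanQuality: number;        // average per-segment quality on a 1-5 scale
  totalStallSeconds: number;  // accumulated rebuffering time
  stallCount: number;         // number of separate stall events
  startupDelaySeconds: number;
}

function illustrativeQoeScore(s: SessionStats): number {
  // Weights are invented for illustration, not drawn from any published model.
  const stallPenalty = 0.3 * s.totalStallSeconds + 0.2 * s.stallCount;
  const startupPenalty = 0.05 * s.startupDelaySeconds;
  const score = s.meanQuality - stallPenalty - startupPenalty;
  return Math.min(5, Math.max(1, score)); // clamp to the usual mean-opinion-score range
}
```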

Finally, earlier this year we were able to run a small-scale trial of low latency streaming. We encoded BBC One through an in-house low latency encoder and distributed the media over a commercial CDN to dash.js-based clients, configured based on the results of our simulations. The results were encouraging and suggested that many of our viewers could get a good quality of experience from a low latency stream with end-to-end delays matching those of our terrestrial broadcast channels.

However, challenges remain to deploying low latency at a large scale.

Firstly, some users would experience a less reliable stream if low latency were turned on universally today. So one area we will be investigating is how we might identify those for whom low latency should work well and those for whom it would not.

Secondly, the client-side low latency approaches we have been trialling depend on TVs and other devices having support for MSE. It will take some time for MSE support to reach a high proportion of the devices our viewers use.
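A client can check for that support before choosing which playback path to use; a minimal sketch of such a check follows, with an illustrative codec string.

```typescript
// Capability check: only enable the MSE-based low latency path where it can work.
const exampleCodec = 'video/mp4; codecs="avc1.640028"'; // illustrative codec string

const supportsMse =
  typeof MediaSource !== 'undefined' &&
  MediaSource.isTypeSupported(exampleCodec);

// Devices without MSE (often older TVs) keep the conventional, higher-latency stream.
const playbackMode = supportsMse ? 'low-latency' : 'standard';
console.log(`Selected playback mode: ${playbackMode}`);
```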

That said, whilst those issues may prevent our streaming latency reaching parity with broadcast for some time yet, there are still some things we can do in the meantime. For example, there are contributions to latency that can be improved further up the chain. These do not depend on client device capabilities or affect reliability, so incremental improvements for our viewers should be possible as the relevant parts of our streaming architecture are upgraded or changed.
