
How low is the new high for live content?

Blog Post created by nhoch on Jun 6, 2017


By Thibaud Regeard, Limelight Solutions Engineer

 

This article is the first of three diving into the world of video streaming and the challenges of providing a user experience that is consistent with the very nature of live content.


Why does that matter? You might have experienced watching a live soccer game through your favorite OTT app or website, enjoying smooth quality and cheering as your team gets closer and closer to the opponent's goalkeeper… Then suddenly you hear your neighbor screaming and notifications pop up on your phone: “GGGOOOAAALLL!!!” Wait, there is no such goal… Oh, there it is! Thirty seconds later, you see it happen and all the fun is ruined. Does that ring a bell?

 

 

Latency is a term commonly used throughout the Internet. It relates to your connection with remote content servers and can make or break your experience with online content. Low latency makes sure that users can enjoy that content to the fullest, and this is especially true for time-critical content, be it online gaming or gambling, live auctions, security feed monitoring, or live video streaming.

 

So what exactly do we mean when we talk about latency? Often measured in milliseconds, latency - for the purposes of discussing delivery of online video - is defined as the average total time it takes the internet, or a content delivery network, to send video to whatever device you will view it on.
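
A quick way to build intuition for that number is to time how long a single video segment takes to arrive. The sketch below is only an illustration: the segment URL is a hypothetical placeholder, and real glass-to-glass latency also includes capture, encoding, packaging, and player buffering, not just the network transfer measured here.

    import time
    import urllib.request

    # Hypothetical URL of one video segment served by a CDN (placeholder, not a real endpoint).
    SEGMENT_URL = "https://cdn.example.com/live/stream/segment_001.ts"

    start = time.monotonic()
    with urllib.request.urlopen(SEGMENT_URL) as response:
        data = response.read()  # download the whole segment
    elapsed_ms = (time.monotonic() - start) * 1000

    print(f"Fetched {len(data)} bytes in {elapsed_ms:.0f} ms")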

 

Taking it from that definition, it's pretty clear that low latency is important whenever you consume online content. Latency can be impacted by many factors: the type of internet connection, your router or switch, your Internet Service Provider, the number of users, or the distance between your computer and the server. While a fair amount depends on the state of the network and the route used to reach the content, we will focus today on the inherent latency created by the technology layer used to bring the content online.

 

How live broadcasters can improve their speed

 

Time is money. Whether it's high-frequency financial trading or live content streaming, you want to reduce the time needed to transport your content from its source to your eyeballs. Distributing video content over the top has become an increasing concern as it embraces a variety of use cases: you probably don't want to learn about the latest touchdown on Twitter BEFORE you actually see it happen on the live broadcast. Likewise, you would probably appreciate your security team having video monitoring feeds closer to real time in order to deal with disruptions or criminal activity.

 

For this kind of audience, low latency streaming is not only a nice-to-have, it is a necessity. Latency dictates the delay between the live event as it is happening and when the video or audio appears on the device you are using. Key audiences who can benefit from low latency video are:

 

  • Gambling and gaming users – get news before the odds change
  • Financial organizations – getting the breaking news first can mean being ahead of the markets
  • News organizations – breaking the news first
  • Live Broadcasters – reducing the delay of a live event when viewed across multiple devices
  • Auctions – no point in being too late!
  • Security monitoring – being able to react on time can save lives or valuable assets!

 

The end of Flash technology?

 

For years now, many companies have relied on RTMP (Real-Time Messaging Protocol, developed by Adobe and mostly used along with Flash) delivery for their streaming business. They needed the low-latency aspect of RTMP delivery, as HTTP-based live streams have larger overhead for playlist and video segmentation, which can delay live playback by almost a minute. This type of latency is not acceptable for real-time communication such as live auction feeds or video chat.
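
To see where that near-minute of delay comes from, consider a back-of-envelope estimate: with segmented HTTP streaming, the player typically waits for several complete segments before starting playback, so latency grows with segment duration. The numbers below are assumptions chosen purely for illustration, not measurements of any particular service.

    # Rough, illustrative estimate of live latency for segmented HTTP streaming.
    # All values are assumed for the sake of the example.

    segment_duration_s = 10      # length of each video segment
    buffered_segments = 3        # segments a player commonly buffers before playing
    encode_package_s = 5         # encoding, packaging, and playlist update overhead
    network_overhead_s = 2       # CDN propagation and request/response time

    http_latency_s = (segment_duration_s * buffered_segments
                      + encode_package_s + network_overhead_s)

    rtmp_latency_s = 3           # RTMP pushes a continuous stream, typically a few seconds

    print(f"Segmented HTTP live latency: ~{http_latency_s} s")   # ~37 s, approaching a minute
    print(f"RTMP live latency:           ~{rtmp_latency_s} s")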

Did you know that Facebook Live uses RTMP in its mobile apps to push a live stream out to its CDN? Anyone who is streaming live content is very likely using RTMP to push the video feed.
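
As an illustration of what pushing a stream over RTMP looks like in practice, the sketch below invokes FFmpeg from Python to send a local file to an RTMP ingest URL as if it were a live feed. The ingest URL and stream key are hypothetical placeholders, it assumes FFmpeg is installed, and a real setup would capture from a camera or hardware encoder rather than a file.

    import subprocess

    # Hypothetical RTMP ingest endpoint and stream key (placeholders).
    INGEST_URL = "rtmp://ingest.example.com/live/my-stream-key"

    # -re reads the input at its native frame rate, simulating a live source;
    # the output container is FLV, which is what RTMP expects.
    subprocess.run([
        "ffmpeg",
        "-re", "-i", "input.mp4",
        "-c:v", "libx264", "-c:a", "aac",
        "-f", "flv",
        INGEST_URL,
    ], check=True)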


The Real-Time Messaging Protocol (RTMP) was designed for transmission of audio, video, and data between Adobe Flash Platform technologies, including Adobe Flash Player and Adobe AIR. Fewer and fewer companies rely on the Flash plug-in for desktop browsers, as most of the big web companies like Google, Mozilla, Apple, and Microsoft are already pulling the plug on Flash, phasing out its support in their most recent browsers.


In most of these live-streaming scenarios, though, the Flash plug-in is not necessarily used in a desktop browser. RTMP can be used as a transport in a mobile or desktop application without requiring the Flash plug-in.
So Flash technology is still there, flexing its muscle where the action is and keeping the data moving when there isn't room for processing delays. Despite all its shortcomings, we must acknowledge that Flash can be better than the competition in some use cases.

 

While some CDN vendors have already shut down their RTMP capacity, Limelight still maintains RTMP endpoints for ingesting live streams.
As the popularity and footprint of Flash decline, RTMP delivery is expected to fall off and eventually be replaced by optimized HTTP delivery.
