Live streaming latency: As Super Bowl preps its largest live stream ever, the OTT industry scurries to catch up to broadcast

Pictured: Hot panda action, live from the National Zoo, a la early 2000s live streaming.

We've seen live streaming come an awfully long way, from the early days of watching pandas mosey in and out of frame on simple, bandwidth-starved live cams to delivering this weekend's Super Bowl championship game to millions of online and mobile viewers.

But providers are still trying to solve a nagging problem: delays in a streamed video's delivery that result in viewers seeing a play about 30 seconds after broadcast viewers see it -- and sometimes that delay is as long as two minutes.

With CBS streaming the Super Bowl for free this year, analysts expect online numbers to top last year's record 1.3 million viewers (for comparison, the AFC Championship's live stream on Jan. 24 drew 1.2 million online viewers). "For the sake of fans everywhere, let's hope they are more successful putting the world's most popular television event to the live stream test," said IneoQuest in a blog post this week, noting that last year's live stream by NBC was rated only "mostly OK."

The impact of millions of viewers in the U.S. alone on Super Bowl streaming quality, buffering and even latency could be quite noticeable -- if CBS and its delivery vendor partners haven't taken audience scale into account.


In a study of video buffering -- a distinct risk for live-streamed video alongside latency -- conducted by IneoQuest and Research Now, two-thirds of respondents rated their frustration with buffering video at seven out of ten, and two out of five said they would wait less than 10 seconds before giving up and either switching to broadcast or heading for the nearest bar to watch the game.

How we'll watch the game

CBS will make the Super Bowl stream available on desktops and tablets via its website, as well as on its CBS Sports app on Apple TV, Roku, and Xbox One. (Verizon smartphone users will be the only viewers able to live-stream the game on their mobile phones, using the NFL Mobile App.)

The broadcaster's digital unit has been preparing to stream the Super Bowl for the better part of a year, according to Jeff Gerttula, SVP and general manager of CBS Interactive.

"What we've had to do is go through … every piece of the stack and work to optimize it. From a stability standpoint we're making sure we're eliminating any single points of failure, making sure we have redundancies built everywhere, and from a latency standpoint we're making sure that there's no unnecessary inefficiencies," Gerttula said. "For us, we're inserting ad tech, we're inserting viewer tracking. There's a lot of data components with this and making sure we're testing each of those and optimizing along the way is really important to reducing latency."

To get its stream out to online audiences, CBS Sports is working with a number of third-party providers as well. For example, Level 3 Communications will manage video acquisition, encoding and transport. "In total, more than 1,400 hours of video content will be acquired, encoded and transported across Level 3's Vyvx VenueNet+ platform and managed by Level 3's Vyvx Operations as part of the Super Bowl coverage," Level 3 said in a media statement. The transport provider has supported numerous facets of broadcast delivery of the game for 27 years.

Once video of the game is captured in San Francisco, it travels to CBS' studio facilities in New York City and then goes out for encoding and delivery. "For our hosting we're working with Akamai on that side. And we're using a number of third-party firms for tracking," Gerttula said.

The cross-continent transport, along with the additional processing steps required, naturally introduces latency -- something CBS hopes it has been able to reduce as much as possible.

Solving the latency puzzle

Because an OTT video has to go through several processing steps to, ironically, be delivered faster to more devices, delays creep into the delivery process. Chris Knowlton, Wowza's streaming evangelist, pointed to adaptive bitrate streaming as one example, noting that it "typically has 30 to 40 seconds of latency from glass to glass."

That's in addition to the initial encoding process. Further, the Super Bowl live stream will have to pass through a stage that Level 3 and other video accelerators, such as CDN providers, can't fully control: the connection to the last-mile network. That final leg before reaching a viewer's tablet or streaming device can delay delivery even further -- through bottlenecks at the interconnect, slower speeds within the Internet service provider's network, and issues with the home gateway.
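
To make the math behind those delays concrete, here is a minimal sketch of a glass-to-glass latency budget for a segmented adaptive bitrate stream. The stage names and numbers are illustrative assumptions, not figures from CBS or its vendors; only the 30-to-40-second range comes from Knowlton's comment above. Most of the delay comes from segment duration multiplied by the number of segments a player buffers before playback.

```python
# Rough, illustrative glass-to-glass latency budget for a segmented
# (HLS/DASH-style) live stream. All numbers are assumptions for illustration
# only -- not figures from CBS, Level 3, or Akamai.

LATENCY_BUDGET_SECONDS = {
    "capture_and_production": 1.0,      # cameras, switcher, graphics (assumed)
    "contribution_transport": 0.5,      # venue to broadcast center (assumed)
    "encoding_and_packaging": 7.5,      # ABR ladder encode + segmenting (assumed)
    "cdn_and_origin": 2.0,              # origin, cache fill, edge delivery (assumed)
    "last_mile_and_home_network": 1.0,  # ISP interconnect + home gateway (assumed)
}

SEGMENT_DURATION_S = 6   # a common HLS default; shorter segments cut latency
SEGMENTS_BUFFERED = 3    # players often buffer ~3 segments before starting playback

def estimated_glass_to_glass() -> float:
    """Sum the pipeline stages plus the player's segment buffer."""
    player_buffer = SEGMENT_DURATION_S * SEGMENTS_BUFFERED
    return sum(LATENCY_BUDGET_SECONDS.values()) + player_buffer

if __name__ == "__main__":
    print(f"Player buffer alone: {SEGMENT_DURATION_S * SEGMENTS_BUFFERED} s")
    print(f"Estimated glass-to-glass latency: {estimated_glass_to_glass():.0f} s")
    # With these assumptions the total lands around 30 s, consistent with the
    # 30-to-40-second range Knowlton describes for adaptive bitrate streaming.
```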

It's not a stretch to say that every delivery provider in the industry is working on a solution to live streaming latency. "How can we actually reduce that as people start putting on more live events, especially if it's a sporting event that's time sensitive … to reduce the lag from that perspective?" Knowlton said.

Lining up all the components needed to efficiently process and deliver a live stream can be expensive, particularly for a traditional broadcaster like CBS that is supporting legacy infrastructure as it builds an OTT empire.

Streaming providers look to the cloud to decrease latency

While live streaming has been around for well over a decade, demand for high-quality, reliable OTT streaming has only just begun to build. With those costs firmly in mind, providers looking to live stream are increasingly drawn to cloud-based OTT services.

"For example, we've gone from -- in 2013 we didn't have any what I would call next generation OTT cloud based trials of any kind. No proof of concepts," said Charlie Vogt, CEO of Imagine Communications. "We ended last year with 110."

A recent study by research firm Devoncroft Partners found that cloud deployments have transitioned from a "curiosity" to a "top five" technology project for media and broadcast providers, and many of the companies surveyed have such projects under way or completed. "We're also seeing planned cloud deployments of 'serious' media operations such as playout, compute, workflow, and MAM (media asset management)," said Joe Zaller, founder and president of Devoncroft, in a blog post. "… Perhaps most interestingly, we saw the term 'confidential' more than ever when we asked people about their plans to use virtualization and cloud technology in broadcast and media operations. Based on what we see and hear in the market, we're taking this as an indication that trials and projects are already underway."


Imagine recently expanded its relationship with Hewlett Packard Enterprise, integrating its software more extensively with HPE's hardware products, many of which are used by networks as well as video service providers. The move is in line with what Imagine saw as an inevitable extension of demand for cloud-based services from service providers to broadcasters and networks. Notably, the company signed a deal in spring 2015 with Disney/ABC Television Group under which Disney/ABC will transition playout, delivery and network operations for all of its programming from premises-based infrastructure to Imagine's cloud-based architecture.

Other companies are looking for traction in the online video space by targeting cloud delivery, with some employing emerging technologies such as software-defined networking (SDN) and network functions virtualization (NFV). Both promise additional automation and efficiencies in overall cloud computing operations, and could have a telling impact on OTT video services. IBM, for example, has been building its cloud presence, firmly placing its stake in the OTT game by first acquiring Clearleap in December and then buying UStream, a delivery vendor that was already providing the online video platform component of its IBM Cloud.

"What we hear from customers is what they really want is a high quality platform that has global scale and has well-thought out API that they can tap into," said Brad Hunstable, CEO of UStream.

Taking stream optimization to the device

Optimizing encoding and delivery is important, but according to Giraffic's Ravid Hadar, head of marketing for the Tel Aviv-based company, the next best place to improve live-streamed content quality is in the end device.

Giraffic has partnered with Samsung and LG to integrate its acceleration software into their smart TVs -- one in three smart TVs produced by these manufacturers now contains Giraffic's software -- and the vendor announced at CES that it will be doing the same for smartphones and tablets, with plans to launch its mobile product, AVA Mobile, at the Mobile World Congress show.

Giraffic's video acceleration strategy has three elements: analyzing the network's capacity over several seconds; predicting the quality level of streaming video over the next minute or so; and determining the highest consistent video quality. "For example, let's assume that for 10 seconds we can get 4K quality. But then for the next 60 seconds we can only get HD quality. We will target downloading the HD. Because we believe that not only is the highest quality the important one, but also the consistency of the quality of the download. And then after we decide what quality we're going to download for the next 10 seconds, what we do is implement that acceleration strategy," Hadar said.
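
A minimal sketch of that three-step idea might look like the following: measure recent throughput, make a conservative prediction for the coming window, and choose the highest rendition the prediction can sustain throughout. The function names, the prediction step, and the rendition ladder are illustrative assumptions -- Giraffic has not published its algorithm -- with the 4K and HD thresholds borrowed from the figures Hadar cites.

```python
# Illustrative sketch of "highest consistent quality" selection as described
# above. Function names and the prediction step are hypothetical -- Giraffic's
# actual algorithm is proprietary. The 4K/HD thresholds follow the figures
# Hadar cites (4K > 60 Mbps, HD > 8 Mbps); the SD floor is assumed.

from statistics import mean

RENDITIONS = [("4K", 60.0), ("HD", 8.0), ("SD", 2.5)]  # (label, required Mbps)

def predict_throughput(samples_mbps: list[float]) -> float:
    """Naive forecast for the next minute or so: assume it resembles the
    recent past, discounted toward the worst recent sample for safety."""
    return 0.5 * mean(samples_mbps) + 0.5 * min(samples_mbps)

def highest_consistent_rendition(samples_mbps: list[float]) -> str:
    """Pick the best rendition the prediction can sustain for the whole
    window, rather than the best instantaneous quality."""
    predicted = predict_throughput(samples_mbps)
    for label, required in RENDITIONS:
        if predicted >= required:
            return label
    return RENDITIONS[-1][0]  # fall back to the lowest rung

# Hadar's example: a short burst that would allow 4K, followed by a stretch
# that only supports HD -- so HD is chosen for consistency.
samples = [65.0, 62.0, 10.0, 9.5, 10.5]
print(highest_consistent_rendition(samples))  # -> "HD"
```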

Being able to analyze at the end device is important because delivery vendors, CDNs and other acceleration services can't control what happens in the last mile network. Giraffic's product can analyze a video's quality after it passes the home gateway and reaches the user's device, and optimize what's available.

Giraffic multisource streaming vs. standard HTTP: Giraffic claims higher throughput of online video on devices using its acceleration software.

"The first advantage is to maximize what you can obtain of the home network. For example there are many issues related to the last mile which cannot be solved when integrating into the server," Hadar said. "We are (ensuring) the streaming process will receive the right priority out of the resources that are connected to the network. … For example, a network supports 30 Mbps. But if you're looking at the Netflix index you see that most providers get in peak hours 3 to 4 Mbps maximum. To support HD video you need more than 8 Mbps. To support 4K you need more than 60 Mbps.  So what we are doing is, if you have 20 or 30 Mbps in your network, we are supporting users to get as high as possible out of this 20 to 30 Mbps network. We may not get as high (as that) … but we are able to get almost the maximum out of the network.  This is something that can only be done from the manufacturer device side."

Grabbing worldwide appeal

While sports presents the strongest case for grabbing online viewers, other media genres and industry verticals are increasingly using live streaming.

Hunstable noted that enterprises "increasingly are starting to look like those (media) companies. What I mean by that is … for content marketing, for product launches, they're doing video as well inside their organizations."

In the entertainment world, pre-awards show red carpet ceremonies are frequently streamed live ahead of the scheduled television broadcast.

Another rapidly growing area for live streaming is eSports. Turner Networks and now ESPN are dedicating broadcast and live-streaming bandwidth to covering high-profile gaming events. It's an arena that got little attention from mainstream media providers before the rise of live-streamed gaming site Twitch.

And live streaming even plays a role in areas some would consider mundane. Wowza's Knowlton pointed to the education vertical as a key growth area for live streaming. "It's great that you can do adaptive bitrate streaming but if you're trying to have feedback with a professor (teaching a lecture) I don't want to wait 30 seconds to get feedback from my question. I want to hear what they just said."

With a growing number of use cases for live streaming, closing the gaps in quality assurance is becoming more important than ever.