Live – The World is Watching

It’s part of the human experience. We all want to see events happen in real time. We want to be a witness so we can say to each other, “Did you see Messi’s goal last night?” If we can’t be there in person, in a seat in the stadium watching our team or listening to our band, then we reflexively grab the remote and tune in. And, if history is a teacher, our appetite to be there, via a live broadcast, knows no bounds. The trends here are simply staggering.



Image Credit: AP

Let’s take a quick look back at what we’ve watched together:

  • On 20 July 1969, 530 million people watched the first humans ever to walk on the surface of the moon (this constituted around 14% of total population of the world at the time)
  • A satellite broadcast of Elvis Presley shown live from Hawaii, titled ‘Aloha from Hawaii’ on 14 January 1973, is reported to have reached 1 billion viewers globally
  • The 2008 Summer Olympics is the current record holder for a multi-day broadcast. It is estimated by Nielsen Media Research that up to 4.7 billion individual viewers (70% of the world’s population) watched some part of the coverage
  • The 2011 Cricket World Cup semi-final between India and Pakistan is reported by the Guardian to have been watched by about 1 billion people

More recently, look at the audience for both the Super Bowl and Olympics over the last 8 years.



Source: Olympics.org, NFL.com

Okay, let’s agree right now to set the Elvis data point aside. The pattern is otherwise clear: we want to witness and celebrate the best in human achievement together.

Now it’s time to put this audience online.

We live in a truly unique time. More than nine out of ten people on earth own a mobile phone. Smart TVs, iPads, tablets, and laptops are everywhere. We can reasonably expect there will be an event in the near future that essentially everyone on the planet will watch live.

Moreover, given trends, it is most likely the majority of viewers will be streaming that live event online.

But this inevitable event, when the world stops to watch, won’t come easy. No, we have a lot of work to do. Hard problems to solve.

Ever been frustrated when the experience of watching a live sporting event streamed online falls way, way short of your expectations? You’re not alone. It seems everyone who has tried to stream a live event either gave up well before the final score was posted or, perhaps worse, had to suffer through endless rebuffering events and disappointingly grainy video. During the 2013 Super Bowl, for example, less than 1% of viewers streamed the game online. The average stream duration was 38 minutes and the average number of stream interrupts was 3. This online performance will certainly not inspire the rest of the world to tune in next year.

When you see this, is there a little voice in the back of your head saying, “Hey, why do I need to wait?”



One of the most memorable images of our day. So sad.

We’ve waited too long. We’ve been far too patient. Far too understanding. Isn’t it time for a change?

Today’s Live Streaming Solution – Stop play, injury time out

Today, live events are streamed by content providers and commercial CDNs that send a long-distance unicast stream to each and every viewer. These redundant streams are massively inefficient, as they require a dedicated HTTP session for every subscriber. They consume enormous capacity on operator networks and, when the event is popular, can slow traffic on the network to a crawl. It’s no surprise, given how live streams are handled today, that most consumers are disappointed with the live streaming experience.
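To see just how badly per-viewer unicast scales, a back-of-the-envelope calculation helps. The bitrate, viewer count, and cache count below are illustrative assumptions, not figures from any real deployment:

```python
# Back-of-the-envelope: long-haul capacity needed for unicast live delivery
# versus edge-cached delivery. All numbers are illustrative assumptions.

BITRATE_MBPS = 5        # assumed bitrate of one HD live stream
VIEWERS = 100_000       # assumed concurrent viewers in one region
EDGE_CACHES = 50        # assumed caches placed at the subscriber edge

def unicast_gbps(viewers: int, bitrate_mbps: float) -> float:
    """Long-haul capacity when every viewer gets a dedicated unicast stream."""
    return viewers * bitrate_mbps / 1000

def cached_gbps(edge_caches: int, bitrate_mbps: float) -> float:
    """Long-haul capacity when each edge cache pulls a single seed stream
    and serves all nearby viewers locally."""
    return edge_caches * bitrate_mbps / 1000

print(unicast_gbps(VIEWERS, BITRATE_MBPS))      # 500.0 (Gbps, no caching)
print(cached_gbps(EDGE_CACHES, BITRATE_MBPS))   # 0.25  (Gbps, with caching)
```

Under these assumed numbers, per-viewer unicast needs 500 Gbps of long-haul capacity where edge caching needs a fraction of one Gbps — a roughly 2000x difference, which is the core of the problem described above.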

Live Caching of unmanaged OTT Video: A very hard problem to solve

The solution is clear. We need to cache live OTT streams deep inside the operator network. However, a system that can dynamically cache live, unmanaged OTT video streams inside the operator network presents a very hard engineering problem. Unlike VoD, caching live streams requires some very clever engineering to preserve the real-time experience. When engineering a cache for live streams, time is of the essence. Thankfully, no one knows video like Qwilt. Given our caching experience and performance with Video on Demand, we had a head start on solving the live stream cache problem before the first line of code was written.

Qwilt Live Stream Cache takes the field

Today, Qwilt announced the industry’s first solution to dynamically cache streams of unmanaged and managed live video content: Qwilt Live Stream Cache. This revolutionary functionality enables network service providers to optimize both OnDemand and Live Video traffic by deploying a single Video Fabric Layer, the QB Series, across the subscriber edge of their network. Once deployed, this Video Fabric will intelligently classify traffic, cache popular Live Streams and VoD content, and then locally deliver that content the next time it is requested by a downstream consumer. The result is a network that is optimized for both video delivery and consumer Quality of Experience. Everyone in the ecosystem wins: the Content Provider, CDN, Network Operator and Consumer.

The secret in Qwilt’s Live Stream Cache is our ability to detect popular live OTT video streams and instantly direct those streams into the Video Fabric Controller’s onboard FastCache, a dedicated control and storage path optimized for quick delivery of live streams. The Qwilt Live Stream Cache then establishes a local live video transmission point in each neighborhood that can serve a very large population of nearby subscribers using a single seed stream from the origin server, relieving significant strain on the operator’s network.
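The general shape of this idea — count requests for a live segment, and once a segment proves popular, start serving it locally from a single origin fetch — can be sketched in a few lines. The threshold, keying, and eviction policy below are assumptions for illustration only; they do not describe Qwilt’s actual implementation:

```python
# Minimal sketch of a popularity-triggered live segment cache.
# The popularity threshold and LRU eviction are illustrative assumptions,
# not a description of Qwilt's FastCache internals.

from collections import OrderedDict

class LiveSegmentCache:
    def __init__(self, popularity_threshold=3, max_segments=64):
        self.requests = {}                # segment URL -> request count
        self.store = OrderedDict()        # segment URL -> bytes, in LRU order
        self.threshold = popularity_threshold
        self.max_segments = max_segments  # live windows are short; cap storage

    def fetch(self, url, origin_fetch):
        """Serve a segment from cache if present; otherwise count the request,
        pull from origin, and start caching once the segment proves popular."""
        if url in self.store:
            self.store.move_to_end(url)   # refresh LRU position on a cache hit
            return self.store[url]
        self.requests[url] = self.requests.get(url, 0) + 1
        data = origin_fetch(url)          # one trip upstream for this request
        if self.requests[url] >= self.threshold:
            self.store[url] = data
            if len(self.store) > self.max_segments:
                self.store.popitem(last=False)  # evict the oldest segment
        return data
```

Once a segment crosses the threshold, every subsequent nearby request is served from the local store, so an arbitrarily large neighborhood audience costs the origin only a handful of fetches per segment — the same effect the seed-stream design above is after.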

Play on.

High quality live event streaming is the Holy Grail of online video. At that critical moment, when Messi steps up for the penalty kick and the championship is on the line, you don’t want the phrase “rebuffering” to be seared into your memory forever. You want to see the goal. You want to be there for the win.

Live Stream Cache. More magic brought to you by Qwilt.

No buffering. No delays. Play on.



Image Credit: AP

Mark Fisher

Posted on Thursday, January 30th, 2014
