
In the Limelight Blog

14 Posts authored by: jthibeault

Note: Some of the content in this post is taken from “The State of Online Video” report. Download the complete report here.

 

There has been so much hype about over-the-top (OTT) video services like Netflix in recent months that you would think traditional, linear broadcast television is dead. But that is far from the case.

 

Figure 1: Time Spent Per Day with Video by US Adults, by Device (2011-2015)

 

According to eMarketer (Figure 1), although online video consumption has grown annually (sometimes by as much as 70%), the majority of video (approximately 85% in 2015) is still watched via the television. In fact, looking at the graph above, the growth of online video consumption has actually been slowing even as it eats into traditional TV time.

 

Still, we can’t deny the fact that there is a fundamental evolution happening. Carrying eMarketer’s graph to its logical conclusion, we could argue that at some point in the distant future, online video will replace traditional television. What we once considered the “broadcast television experience” will eventually migrate to online delivery.

 

This evolutionary shift is exactly what we wanted to explore in our recent survey and report, The State of Online Video. For this report, we surveyed more than 1,200 consumers from the U.S., Canada, the U.K., and Australia on a wide range of topics related to online video, including:

  • How much online video they watch
  • Which devices they primarily use to consume content
  • What frustrates them about watching it
  • How they feel about advertising

 

What we uncovered were some startling results that point to a generational shift in how video content is consumed: younger Millennials (18-25) not only consume more online video content, but they do so from more devices, and they are much more forgiving of online video frustrations such as buffering.

 

Younger Millennials Consume More Video Content

Although online video is clearly becoming an increasingly large portion of Internet traffic, and despite a seemingly growing number of cord cutters, linear broadcast still accounts for the vast majority of the “television” experience.

 

But if our survey results are any indication, a shift has already happened and is being propelled generationally. When asked how much online video they watch each week, the majority of consumers (Figure 3) indicated 1-2 hours per week.

 

Figure 2: How Much Online Video Do You Watch Each Week? (Respondents aged 18-25)

Figure 3: How Much Online Video Do You Watch Each Week? (Respondents aged 18+)

 

And yet when we correlate the data against demographics, the younger the respondents (Figure 2), the more online video they watch, with the majority indicating 4-7 hours per week. Based on this data, it might be said that younger Millennials represent a “tipping point” for television: a transition from traditional broadcast delivery to online that will only accelerate with subsequent generations who grow up with content available anywhere, anytime, from any device.

 

Devices are the Gateway to More Online Video

The proliferation of mobile devices such as smartphones continues to increase each year, as Figure 4 illustrates (developed using data from the Consumer Electronics Association and the U.S. Census Bureau).

 

Figure 4: Smartphone sales in the United States from 2005 to 2015 (in billions)

 

But according to our survey results, despite the growing popularity of smartphones and other mobile devices, the personal computer is still the preferred means by which to consume online video (Figure 5).

 

Figure 5: From which device do you watch online video? (Rank in order of frequency, 10=most frequent)

 

But much like online video itself, device choice is generational as well. As the demographic skews younger (18-25), smartphone usage increases significantly (by approximately 10%), indicating that the smartphone may well overtake the laptop/PC as the predominant device for consuming online video over the next five to ten years.

 

Buffering Frustration Decreases with Age

The duration and number of times a video buffers have long been linked to abandonment rates. According to a 2012 study from the University of Massachusetts at Amherst, for example, audience abandonment increases with buffering: viewers begin to disappear after a two-second delay, with 6% leaving each second thereafter. The data from the university study correlates with what we discovered from our survey respondents: buffering is the number-one cause of frustration when watching online video (Figure 6).
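The abandonment pattern described above can be sketched with a simple model. This is only an illustrative interpretation of the study's figures (treating the 6% as a linear loss per second past the two-second mark); the study itself reports the aggregate trend.

```python
# Illustrative model of the abandonment pattern described above: viewers
# begin to leave after a two-second buffering delay, with roughly 6% more
# abandoning for each additional second. The linear interpretation is an
# assumption for the sake of the sketch.

def audience_remaining(delay_seconds: float) -> float:
    """Fraction of the audience still watching after a buffering delay."""
    if delay_seconds <= 2:
        return 1.0
    lost = 0.06 * (delay_seconds - 2)  # ~6% lost per second past the 2s mark
    return max(0.0, 1.0 - lost)

for delay in (1, 2, 5, 10, 15):
    print(f"{delay:>2}s delay -> {audience_remaining(delay):.0%} of viewers remain")
```

Under this reading, a ten-second buffering event would already cost roughly half the audience, which is consistent with why publishers treat rebuffering as the top quality-of-experience metric.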

 

Figure 6: What is the most frustrating aspect of watching video online? (Rank in order of frustration, 10=most frustrating)

 

But where the university study examined the impact of the duration of a buffering event on abandonment rates, we looked at the other side of the coin: the number of buffering events. What we found when we asked the question, “How many times will you allow a video to buffer before abandoning it?” was startling.

 

Figure 7: How many times can an online video buffer before you abandon it? (Select one; respondents aged 18-25)

Figure 8: How many times can an online video buffer before you abandon it? (Select one; respondents aged 18+)

 

As indicated by Figures 7 and 8, the majority of all respondents (Figure 8) would allow a video to buffer twice before abandoning it, but when the demographic skewed younger (Figure 7), the majority was willing to allow a video to buffer three times before leaving. This suggests that younger Millennials have a much higher “frustration threshold” than the general population, perhaps indicative of a deeper understanding of (or forgiveness for) the technology required to enable online video.

 

Conclusion

Online video, while growing by leaps and bounds, represents a complicated set of consumer behaviors that varies markedly across demographic segments. Although we asked a wide variety of questions spanning attitudes toward cable providers, online video consumption habits, and even advertising in online video, we still don’t have a complete picture of consumer behavior.

 

What is abundantly clear is that we are witnessing a tipping point in the traditional television experience as younger generations skew towards watching more online video, using more OTT services, and posting video to more places.

 

In short, younger Millennials are embracing online video as the primary means of consuming and sharing content and, in doing so, are driving significant change in the way everyone experiences traditional broadcast television. It will be interesting to see, over the next decade, how prevalent these behaviors become in subsequent generations and what impact they have on the generations that have not yet embraced this new experience.

 

For more information about these findings and other results from the study, we invite you to download the complete report.

A few years ago, StreamingMedia.com posed the question, “Streaming to All Devices: Is It Worth It?” Although the content may be a bit dated, the question remains germane—how should you approach delivery to multiple devices? Should you even worry about delivering to all of them?

 

In today’s hyper-connected world, consumers are getting online more often, through more devices, and staying connected longer. Via phones, tablets, laptops, and TVs, users are accessing a surprising amount of content from everywhere, and that trend shows no sign of slowing down. In fact, Cisco’s mobile data projections estimate that by 2016 mobile devices will consume 140 exabytes annually, of which over 70% will be video.

 

Re-reading the StreamingMedia.com article, it comes to a sound conclusion: delivering to all mobile devices creates a significant amount of complexity (and expense) in a video publishing workflow. From that angle, “expense” is a critical concern because you are responsible for your workflow. Your gear. Your software. Your problem. And as the device market continues to fragment (embroiled in what seems like an ongoing conflict over video encoding standards), the problem becomes even more daunting.

 

In that case, the question posed by the Streaming Media article makes a lot of sense. There is only so much time in the day to read about new formats, optimize conversion based on best practices, deal with troublesome content encoding, keep your hardware or software up to date, and so on. You have to remain focused. Get the most “bang for your buck,” as they say. So it seems, on the surface, that you should focus on the devices with the most impact (e.g., iPads and iPhones). As indicated in the infographic below (an excerpt from a larger graphic developed by Verizon Digital Media Services), Android devices are vastly more fragmented than iOS, creating a significant problem for delivering video.

 

[Infographic excerpt: Android device fragmentation, Verizon Digital Media Services]

 

But that fragmentation shouldn’t influence your video conversion strategy. Ignoring any part of the Android market, for example, essentially alienates part of your potential audience. Because even as devices are fragmented, consumers are more so. According to Conviva’s 2014 Viewer Experience Report, your digital audience is accessing your online video throughout the day using a multitude of devices:

  • 6-10:00AM: Mobile devices – 6.9% of all video streamed daily
  • Noon-4:00PM: PC – 16.3% of all video streamed daily
  • 7-11:00PM: TV – 36.6% of all video streamed daily

 

Android fragmentation is only the tip of the iceberg when you think about all of the other devices to which you can make your online video available.

 

So what ultimately happens if your video isn’t available on the device users are employing? Missed opportunity. And not just missed purchases and ad impressions; your brand suffers as well. If you consider Facebook’s brand uplift and video advertising impact study, the “miss” is pretty catastrophic with respect to recall and awareness. As indicated in Facebook’s analysis of Nielsen data for online video ads (Figure 1):

 

…people who watched under three seconds of the video ad created up to 47 percent of the total campaign value, and people who watched for fewer than 10 seconds created up to 74 percent, depending on the metric. That means that while lift continued to increase the longer people watched, people didn’t have to watch a whole video to be affected by the ad. Even video views under 10 seconds effectively build awareness and drive purchase intent.

 

(Figure 1)

 

It would seem that every device on which your content is not available is a ding against your brand. Users don’t care about your problems with video conversion. Like children demanding a favorite toy, “they want to watch whatever they want, whenever they want, wherever they want.” Sure, I am extrapolating Facebook’s analysis to all kinds of video (rather than just video ads). But I feel confident concluding that every time your video isn’t available, your brand suffers in awareness and recall just as much as your monetization does. Think about it: if you are generating ad impressions in your videos, charging users a subscription, or enabling click-through purchases, then every device you can’t reach limits your opportunities to generate revenue from your content. It’s probably safe to say, then, that not being available on every device has a negative impact on your business, from brand to revenue generation.

 

Perhaps the question that StreamingMedia.com posed then should be amended to something like, “Streaming to All Devices: Can You Afford Not To?”

 

So what can you do? It’s just not feasible, in light of the growing number of devices and the fragmentation within those devices, to tackle this problem yourself and focus on your core business at the same time. I’ve talked to countless customers who have expressed that exact concern. They know they can’t afford not to be on all devices but they can’t afford (either in CAPEX or OPEX) to take on the responsibility of getting the content there themselves.

 

That’s where service providers step in.

 

We are at an inflection point in the technology of video delivery. For the first time, there are enough cloud-based resources and software sophistication to enable “publish once, deliver anywhere” functionality. With adequate geographic distribution of those resources, content publishers should be able to get their content to any device without batting an eye. In fact, the onus for transforming content and ensuring it’s delivered falls completely on the service provider.

 

We have seen companies like Encoding.com and Ooyala tout this exact value proposition. But part of the equation of getting content to every device is having the resources to deliver it. Many of these companies partner with CDNs for delivery; their “publish once, deliver anywhere” service is, for the most part, an add-on. If the trend in video consumption is truly any device, anywhere in the world, then the technologies to transform that content should be intrinsic to the very nature of delivering it.

 

As an example, Limelight has launched publish once, deliver anywhere functionality for both video-on-demand (VOD) and live streaming. But it’s not something customers sign up for as a separate product. It’s an aspect of the core streaming and delivery services (Orchestrate Video and Orchestrate Delivery) that Limelight already offers. Customers publish content to our network for delivery around the globe. Why shouldn’t that content be transformed to the right format when users request it? In that sense, the functionality of “publish once, deliver anywhere” is integrated with the very technology used to deliver it.

 

So can you afford not to deliver to every device? The simple answer is no. Not only are you missing out on brand uplift, but also on purchase opportunities and other forms of monetization. But should you have to assume the burden of ensuring your content can reach all those devices? As we pass this inflection point, the simple answer is again no. Service providers like Limelight are putting their compute resources, delivery network, and software to work around the globe so that your content can be automatically available for delivery to any device. For once, you’ll be able to reap all the rewards of having your video available on any device (the brand awareness, the monetization, the uplift) without any of the hassle of doing it yourself.

First things first. Grab a towel and, yeah, you guessed it…Don’t Panic!

 

All Douglas Adams humor aside, coming under a DDoS (Distributed Denial of Service) attack is no laughing matter, even if you have invested in the best security hardware money can buy in front of your web server. That’s because an attack so close to your origin can have serious implications for the rest of your network as well. A high-volume DDoS attack could interfere with other network traffic getting out (e.g., email) as your bandwidth, routers, and switches all get saturated with “bad” traffic. And don’t go to sleep at night thinking that your CDN is going to solve the problem all by itself, either. Sure, a big enough network can absorb some of an attack, but it can’t differentiate between good and bad traffic, and if the targeted page is dynamic in nature (i.e., uncacheable content), all requests are going back to origin anyway.

 

Which is why the potential volume of traffic shouldn’t be the only thing to concern you. DDoS attacks are moving up the stack. Where they were once primarily Layer 3 and Layer 4 (SYN floods, ping attacks, etc.), they are now moving into Layer 7, in which automated scripts attack login pages and other web forms, mimicking real-user behavior and making it a lot harder to detect when an attack is happening. In fact, DDoS attacks are evolving from single, one-off events into sophisticated, persistent campaigns (as evidenced in the Radware video below).

 

Given the growing sophistication (and size) of attacks then, what can you do?

  • Shut down your website—definitely not the answer. This will only delay the attack, which may still be targeting a specific page on your origin. When you turn the site back on, the attack begins anew!
  • Look at your server logs—this is the first place to start. Identify the page or pages being requested the most and isolate them. Remove them if possible, and set up a “Sorry, we are currently experiencing technical difficulties” page.
  • Call your ISP!—if you are hosting your website offsite, call your ISP or datacenter and let them know you are under attack. Although they may already be aware (with a big enough DDoS attack, their links are going to saturate with bad traffic as well), they can potentially address the matter quickly by modifying settings on their routers and switches to drop your traffic (which, unfortunately, means your good traffic as well; but at least the attack will stop so that you have time to identify the target).
  • Address your network—there are a number of things you can do fairly early on, like rate-limiting your router to prevent your web server from being overwhelmed, adding filters to tell your router to drop packets from obvious sources of attack, and timing out half-open connections more aggressively. Of course, in the face of a larger volumetric attack, these measures will only buy you a little time.
  • Implement security—what you really want is to prevent the attack from happening in the first place. That’s why you need a security solution sitting somewhere in front of your origin.

 

Let’s pretend that you really are under attack (while you are reading this blog). I think it’s pretty safe to say that shutting down your website isn’t the answer. So you look through those logs. Yup, page found. It’s your login page (the one with the captcha on it). You could rename the login page and inform users, but it’s a pretty safe bet that the attack will happen again. Better yet, you remove the login page and put up a “we are currently experiencing technical difficulties” message to inform users that there’s a problem and that they can’t log in for a while. Good start.
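The log-triage step above can be sketched in a few lines. This is a minimal illustration, assuming logs in something like the Common Log Format; the sample log lines and paths are hypothetical, and real logs will need their own parsing.

```python
# A minimal sketch of the log-triage step: tally the most-requested paths
# in a (hypothetical) access log to spot the page a Layer 7 attack is
# hammering. Sample lines are made up for illustration.
import re
from collections import Counter

LOG_LINES = [
    '1.2.3.4 - - [23/Feb/2015:16:33:00 +0000] "POST /login HTTP/1.1" 200 512',
    '1.2.3.5 - - [23/Feb/2015:16:33:01 +0000] "POST /login HTTP/1.1" 200 512',
    '9.8.7.6 - - [23/Feb/2015:16:33:02 +0000] "GET /index.html HTTP/1.1" 200 4096',
]

# Pull the request path out of the quoted request line.
request_re = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP')

def top_paths(lines, n=5):
    """Return the n most frequently requested paths."""
    counts = Counter()
    for line in lines:
        m = request_re.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts.most_common(n)

for path, hits in top_paths(LOG_LINES):
    print(f"{hits:>6}  {path}")
```

A disproportionate hit count on a single path (here, the login page) is exactly the signature you’re looking for before isolating or removing that page.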

 

But what you really need is security. And pronto.

 

Is all security created equal, though? Well, according to Dan Rayburn in a post on streamingmediablog.com, the short answer is no. DIY security, whether it’s pure CPE (customer premises equipment) or a combination of CPE and cloud, adds significant complexity to your content delivery architecture. As Dan indicates, security purely in the cloud makes a lot of sense for a number of reasons, including mitigating the attack further upstream (at the edge of the network), absorption (the CDN is built to absorb lots of requests), and resiliency.

 

And that’s exactly what Limelight Networks’ DDoS Attack Interceptor can provide. Built into the CDN edge, this cloud-based security offers both attack detection and attack mitigation (via direct integration with traffic-scrubbing centers) to help address typical Layer 3 and Layer 4 attacks as well as those newer, sneakier Layer 7 attempts. It’s the CDN on security steroids. What do you get?

  • Attack notification—you don’t have to figure out for yourself whether you are under attack. Limelight monitors your traffic through our NOC and notifies you if we believe you are being attacked. With the flip of a switch, we can divert your traffic to scrubbing centers, returning only the good traffic to your origin.
  • No infrastructure flooding—the biggest issue with handling a DDoS attack yourself is that both good and bad traffic can flood your routers, switches, and pipes, potentially competing with other network traffic and causing all sorts of mayhem. But because Limelight’s detection happens at the edge, on our network, we can divert traffic before it ever reaches your origin, ensuring that only the good traffic gets to you.
  • Longer lead time—attack mitigation happens far away from your physical origin, meaning there is much more time to understand the attack (after you’ve been alerted that it’s happening) and respond to it.
  • Traffic scrubbing—when an attack happens, you only need to make a simple DNS change to point your traffic to one of our scrubbing centers, where the good requests will be peeled away and delivered back to your origin. Once the attack is over, all you need to do is point your domain back to the CDN. Problem solved.

 

If you are under attack, and under-prepared, you can bet that your site will be down for more than a few minutes. But with our no-panic security solution, we can help you get it back up and running while also providing the peace-of-mind that your site will be protected against future attacks.

OTT is really nothing new. As consumers, we’ve had plenty of access to OTT content from a variety of sources, including traditional incumbents (e.g., Comcast and Time Warner) and pure-play providers (e.g., Netflix and Hulu). In fact, one could say 2014 was a banner year, with the likes of Comcast, DirecTV, Verizon, and others providing out-of-home live streaming for subscribers in addition to growing on-demand libraries[1]. Beyond the established pure-play providers, we also saw a wave of OTT startups such as DramaFever. If the growth of providers in this space says anything, it’s that OTT is the future direction of television.

 

But 2014 was about more than incumbents and their live, linear broadcasts. It was also the year cord cutting became more than just a buzzword—Dish Network announced that its Sling Television would provide live access to a number of premium cable channels without a traditional cable subscription. And some content owners threw their hats into the ring: CBS, HBO, and Nickelodeon all indicated they would provide direct subscriptions to their content, no longer restricting OTT access to consumers with a cable subscription[2]. The value of services like Sling Television and direct-to-consumer content seems obvious. First, they empower the consumer with choice: consumers are no longer encumbered with paying their MSO for channels they never watch. Second, these services challenge the status quo: consumers can get the content they want anytime, anywhere via a stand-alone OTT offering. The MSOs and other providers no longer hold the “keys to the kingdom.”

 

But almost a quarter of the way into 2015, we’ve seen relatively little of what was promised. Where are the OTT services that will liberate us from the yoke of our MSO subscriptions, allowing us to cut the cord once and for all? Of all those that recently announced their intentions, only two have launched thus far—Dish Network’s Sling Television and CBS All Access.

 

Dish Network’s Sling TV

Screenshot: with pop-up

Screenshot: full-screen

 

Dish Network’s Sling TV is a solid offering. Capable of streaming what appears to be 720p content, it’s available as a web app, as native PC/Mac software, and as an iOS/Android app. The service will also be available on a range of devices and smart TVs. From the press release:

 

“Supported devices expected to include Amazon Fire TV, Amazon Fire TV Stick, Google’s Nexus Player, select LG Smart TVs, Roku players, Roku TV models, select Samsung Smart TVs, Xbox One from Microsoft, iOS, Android, Mac, PC”

 

The basic package starts at $20/month and includes a number of first-rate premium channels, with additional channels available for a monthly upcharge[3].

 


What Dish Network provides is a great linear-broadcast experience. Channel changing is easy. Built-in DVR functionality lets you rewind as well as pause (for those channels that support it; ESPN, for example, did not). And the overall quality of the stream is very high on a WiFi connection, although I did notice an audio-syncing issue when watching a movie on the El Rey Network (Cobra—one of the worst Sylvester Stallone movies of all time). The biggest issue with the Sling Television offering is that it’s linear only. There are no VOD offerings associated with the channels, which means if you miss your show…you miss your show. It’s quite possible that the future holds SVOD functionality, but the lack of it now is a definite ding on the service.

 

But the cool thing? No extra advertising. That’s right: you only get the ads that appear in the normal linear content, just as if you were watching regular television. There are no pop-ups or banners.

 

CBS All Access

Screenshot: channel selector

Screenshot: on-demand episodes

 

CBS All Access is exactly what you’d expect—a library of all the CBS content you can watch, for $5.99 per month (click the “Shows” link and the menu is pretty darn amazing). It also provides access to local, linear CBS broadcasts, although this isn’t available in all areas (including Phoenix; I’m sure it’s a matter of negotiating rights with affiliate station owners), so I was unable to test it.

 

Overall, the offering is pretty solid. Watching shows is easy, with simple controls for rewinding, fast-forwarding, and so on. Series are easy to find, and navigating to previous seasons within a series is as simple as selecting the season from a drop-down. Again, I can’t stress enough how much content is available through this OTT offering. It’s pretty impressive.

 

But the service does have its shortcomings. First, and foremost, is the lack of linear content in certain markets, which may well keep some consumers from subscribing. Second, there is no native app for the computer or other connected devices. That doesn’t mean they won’t launch one at a later date, but, again, the early release is limited, and although watching through the Web is okay, a native app provides a much better overall experience. Finally, the way CBS All Access monetizes its assets may keep some consumers at bay.

Screenshot: pre-roll ad with banner

Consumers may find the multiple layers of advertising on the on-demand content a bit annoying (that’s a Red Lobster ad running pre-roll alongside an accompanying banner ad). Not only do you get commercials you can’t skip, but you also get pop-ups and banners associated with the in-stream advertising (and a lot of the same ads, by the way). There are also the same commercial pauses in the on-demand content as you’d find in the linear broadcast. The result is that watching on-demand feels like watching a linear feed.

 

Still, the content library (with current and classic shows) is outstanding, and the promise of linear broadcast is sure to make for a much more compelling offering in those markets where it’s available. Although some might argue that consumers will balk at $5.99 for what seems like regular television content (i.e., not premium content like HBO), for those with a vested interest in CBS content this truly provides an anytime, anywhere ability to watch their favorite series[4].

 

Conclusion

Dish Network’s Sling Television and CBS All Access represent the next wave in OTT—a combination of linear and on-demand content offered as a subscription directly to the consumer. Although it’s unclear when, it’s only a matter of time before we see others like HBO and Nickelodeon offer similar OTT experiences. The ultimate question is whether this will complicate content viewing. If everything moves to à la carte (if CBS is wildly successful, will NBC and ABC follow?), how will consumers manage the viewing experience? Right now, that’s what the MSO does through a visual, interactive guide. But what happens when the consumer is juggling 5, 10, or 20 different OTT offerings, each with an individual subscription? Perhaps this is where smart TV console applications will come into play (I have an LG TV and use it to access Netflix, VuDu, and my home Plex server). But, then again, those consoles are ultimately shackled to the TV, and consumers clearly want mobility as part of their OTT experience.

 

Although there seems to be a clear path for the evolution of television (combined on-demand and linear broadcast OTT services), it also seems apparent that there is still a lot of work to be done to ensure that the overall experience of having multiple OTT services (I could easily see subscribing to Dish Network’s Sling Television, CBS, HBO, and others when I drop my cable subscription) is great for the consumer.


 


[1] Most IPTV, satellite, and cable MSOs only stream content over the in-home network; and even when they do provide out-of-home viewing, the channels are usually limited, as is the case with DirecTV (see https://support.directv.com/app/answers/detail/a_id/3624) and Verizon.

[2] It was unclear whether these offerings would be linear, SVoD, or a combination of the two, although the CBS “All Access” branding seems to point to the latter.

[3] If you do the math here, that’s approximately $1.33 per channel.

[4] Some CBS content is currently available on Hulu as well, but CBS only makes some of its more recent, popular content available days after it airs; CBS All Access provides immediate viewing of first-run content.

Well, the big game is behind us, and now that the dust has settled, it’s worth a little recap of how everything fared.

 

First, if you didn’t know, NBC decided to stream the Super Bowl live to tablets and computers. That’s a big first, but not that surprising. Given that the last three Olympics garnered massive online viewership and the 2014 FIFA World Cup was a huge online success (2010 was no slouch either), it’s clear that NBC was simply following the trend—not all consumers want to be tied to their couch. Sure, television viewing dwarfs online consumption, but it’s all a matter of choice. Not giving consumers choice in how and where they watch a live event like the Super Bowl puts the overall experience at risk.

 

Okay, now that we’ve got a baseline, let’s recap what happened:

  • 2.5 million people in all viewed the online stream at some point, compared with an estimated 114.4 million television viewers.
  • There was an average of 800,000 online viewers per minute.
  • 1.3 million people were watching online at the moment Malcolm Butler made his game-changing interception.
  • There were over 28.4 million tweets during the game and halftime show.
  • ESPN aired 346 hours of NFL programming during Super Bowl week across its networks; people spent a total of 7 billion minutes viewing the coverage.

 

In short, there was a lot of online activity both during the game and leading up to it. But the experience wasn’t without its detractors. Will Oremus at Slate captures a couple of the most obvious complaints:

  • Lag—some people reported up to a minute of lag between the online version and the broadcast, which created some interesting spoilers for those using a second screen to interact with social networks.
  • Ads—NBC sold its online ads separately from its television ads. While some advertisers bought both, many did not. According to Oremus, “however, NBC filled the holes in its commercial breaks by running the same terrible ads again and again—and again, and again, and again.”

 

So what does this all mean? To borrow a phrase from Malcolm Gladwell, we are on the verge of a couple of major tipping points:

  1. Cord cutting. Live events, especially sports, are the last holdout keeping broadcast television alive. As more networks like NBC deliver their live event streams simultaneously via broadcast and IP, this remaining reason not to cut the cord will be removed. Combine that with content owners going direct to consumer (HBO, CBS, and Viacom, for example) and smart TVs with connected apps, and it’s not hard to see a very near future in which consumers watch what they want, when they want, where they want, all via the Internet.
  2. Consumer behavior. Although 2.5 million people viewing the Super Bowl online is only about 2% of the total viewership, the Super Bowl is not a good indicator of this trend because of the social nature of the event: people congregate at homes and in bars to watch the game together. But for other live events, computers, tablets, smartphones, and smart TVs are increasingly the go-to devices because they offer convenience, control, and more interactive experiences. In fact, according to Statista, between 2011 and 2013 the share of people watching OTT content daily increased from 17% to 32%. Although the Super Bowl may not illustrate the trend, the shift is happening.

 

It’s not hard to predict that by, let’s say, 2020, video content consumption will look markedly different. Although people will still use televisions, the large screen will be only part of a multi-screen strategy—just one way to consume content that is available across other devices as well. Imagine instead of 2.5 million people viewing the Super Bowl online, it might be 12.5 million or 22.5 million (especially as broadband penetration continues to grow). Even if you disagree with the predictions, just think about the demographics—today’s 15-year-olds, who have grown up in the digital space, will be 20 (in 2012, those under 15 made up about 20% of the U.S. population). How will they want to watch the big game? The fact that NBC decided to provide the Super Bowl free of charge to tablets and PCs shows it recognizes these tipping points. These trends are not something to fight against. If the live streaming of the Super Bowl demonstrated anything, it’s that there’s a tsunami of change coming that content owners and broadcasters must embrace if they expect to remain relevant.

 

What's next? I think we will see more live events streamed for free (as online ad sales approach parity with broadcast) and more content delivered directly to the consumer. It will only be a matter of time before someone creates a service to aggregate all the "channels" to which a user subscribes, in essence enabling consumers to create their own customized TV playlists of VOD and live content.

 

Whatever does happen, though, it's clear that we should be preparing ourselves for a glorious ride downhill.

 

 

Sources:

  1. http://www.bloggingtips.com/2015/02/05/superbowl-big-advertisers-won-social-media/
  2. http://mesalliance.org/blog/uncategorized/2015/02/02/nbc-super-bowl-live-stream-sets-records-and-frustrates/
  3. http://money.cnn.com/2015/02/02/media/super-bowl-streaming-record/index.html?iid=SF_T_River
  4. http://www.slate.com/blogs/future_tense/2015/02/02/nbc_sports_super_bowl_live_stream_problems_delays_commercials_ruin_online.html
  5. http://espnmediazone.com/us/press-releases/2015/02/espn-delivers-super-audience-super-bowl-week/
  6. http://kff.org/global-indicator/population-under-age-15/ 

TV 2.0

Posted by jthibeault Jan 23, 2015

We all know the typical television experience—you plop down on the couch, pick up your remote, hit the power button, and whamo, the TV comes on and content starts playing.

 

But that tried-and-true television experience is undergoing a massive evolutionary change thanks to a growing host of upstart technology companies bent on changing the way viewers consume video content. Over the past decade companies like Netflix, Amazon, Hulu, YouTube, and M-GO have challenged incumbents around the distribution and consumption of video.

 

But the incumbents aren’t taking it lying down. Many of the biggest operators in the U.S. are fighting back against both that rising tide of technology-based startups and other network operators (e.g., Verizon and CenturyLink) jumping into the content game. The result? Viewers have become empowered, and the traditional television experience is being tossed out the window.

 

What’s driving this transformation? I’ve identified three key trends underpinning this radical shift in how we watch television:

  • IP delivery and anywhere access
  • Direct to consumer
  • Apps and interactivity

 

IP Delivery and anywhere access

The traditional television experience is constrained: first by the fact that it is confined to a single device (the TV), and second by the way it is distributed—over wires, through the air, or via satellite. Both inherently limit where the signal can be consumed and by whom. But the new entrants to the television market don’t rely on traditional delivery methods. Instead, they leverage a growing, ubiquitous phenomenon: the Internet. Through this massive, global IP-based network, upstart media companies are able to distribute content more cheaply and to more places (including the television). And that has appealed to a consumer base that is increasingly connected—viewers want to access the content to which they subscribe from wherever they are, on whatever device.

The 2014 FIFA World Cup. The Olympics. Downton Abbey. All of this content shares one thing in common—it’s freed from location. Viewers can consume it from any device, anywhere in the world. They are no longer tied to their couches, their family rooms, or, especially, a schedule dictated by broadcasters. IP delivery enables anytime, anywhere access across a host of different devices. The upstart challengers to the television status quo have relied on this just as consumer behavior is shifting to multi-device viewing (see Google’s Multi-Screen World study for a snapshot of this behavior change).

But the incumbents aren't sitting idly by. They see the shift to IP and are embracing it. Comcast, Time Warner, Cox, DirecTV, Dish—all of them offer an IP-based OTT service for subscribers to access content on their different devices. (What’s more, many content owners and broadcasters are using IP to deliver their content to television stations and other distribution hubs; it’s a lot cheaper than satellite.) IP and multi-device viewing have truly liberated traditional linear television content from the confines of the family room.

 

Direct to consumer content

Consumers are gaining more choices in how and where they source their content—right in the wheelhouse of the new entrants to the television market. Companies like Amazon and M-GO are building their content libraries and offerings on a direct-to-consumer model, and incumbent broadcasters are exploring the opportunity as well: HBO GO is delivering content directly to consumers, CBS is doing the same, and Dish Network is cobbling together a streaming package for cord cutters. But none of this would make sense if audiences were tied to a specific location and a specific time. IP delivery, coupled with anywhere access and the power of web-based applications, gives content owners the ability to offer their wares directly to consumers. Building a subscription-based website can happen in a matter of weeks, and content protection technologies (like Widevine, UltraViolet, Adobe Access, and Microsoft PlayReady) have continued to become harder to crack and more accepted by consumers. Going direct changes not only the way consumers watch but also the relationship consumers have with content providers. Whatever the end result, it’s clear that unshackling content from the broadcaster is what consumers want and what content owners are starting to provide.

 

Apps and interactivity

In the traditional television experience, content consumption is all one-way. Sure, there have been attempts at providing bi-directional interaction through the remote control, but they were kludged together at best. With the shift from traditional broadcast to IP, content can now be integrated into application experiences, enabling content owners and distributors to wrap interactivity (and monetization) around their content. Rich application experiences can include suggested or recommended content, social media integration (for direct engagement with fellow viewers), and even opportunities to purchase—think image or face recognition that lets viewers click on items in a video for more information, or to buy them. Apps can also unlock an explosion of big data around video content, enabling entirely new features and functionality that aren’t available with traditional broadcast. And it isn’t just the upstarts bringing these apps to market: incumbents and new entrants alike are launching software across the entire ecosystem of devices—phones, tablets, PCs, the Web, smart TVs, STBs, and even game consoles. Apps are becoming the new gateway through which consumers engage with content. A generation of consumers growing up with this new television experience doesn’t want to just watch the content. They want to interact with it…and that can only happen with content delivered through an application.

 

Say “hello” to TV 2.0

Whatever the final outcome, one thing is clear—the rash of technology companies jumping into the traditional television landscape, and the incumbents who are fighting back, have left it irrevocably changed. Call it TV 2.0, a new television experience in which viewers can not only access traditional TV content from any device, anywhere in the world, but do so in a more interactive, immersive manner through engaging applications. But it’s not all rainbows and unicorns. As more content gets pushed via IP, we put increasing strains on the Internet. In order for both incumbent and upstart to provide the kind of “broadcast quality” experience that consumers expect from this content (regardless of where or how it is delivered), they will need a bulletproof method of delivery, something that can fly past the growing congestion on the Internet and ensure that content is not only delivered to wherever consumers are, but flawlessly as well.

 

What does the future really look like? Will we really have a la carte content offerings and an application to manage our subscriptions? Will consumers be able to subscribe to just the content they want? Only time will tell as continued innovation from new entrants to the market upends the traditional television experience. But one thing is clear—we are in a brave new world for how we watch TV, and there’s no going back.

Originally published July 29, 2014

 

As digital becomes more important in how organizations connect with their prospects and customers, it behooves us from time-to-time to take a barometer reading of user expectations and perceptions. That’s exactly what we did in our annual study, The State of the User Experience.

Based on a survey of over 1,000 end users, our report finds some startling and interesting conclusions:

 

  1. Performance is the most important expectation for digital experience and can directly affect revenue
  2. Mobile devices are becoming the primary web access point for consumers, who now expect similar performance from mobile devices and desktop browsers
  3. The value of web experience personalization remains to be seen

 

This infographic captures some of the data points behind the conclusions. Download the report today and learn more about how users really feel about your digital experiences.

 

 

Originally published October 22, 2014

 


End-user bandwidth is continuing to increase. Whether it’s the migration from 3G to 4G to 5G on mobile devices, improved WiFi, or fiber to the home, people around the globe are increasingly finding themselves with more bandwidth. In fact, according to Akamai’s most recent State of the Internet Report, global broadband speeds are up 24%!

 

But there’s a problem with that.

 

Just as consumers are getting faster connections to digital experiences, the organizations providing those experiences may be tempted to take their foot off the proverbial gas—assuming that more user bandwidth will, by itself, improve the performance of those digital experiences (i.e., websites, online games, software downloads). Only that isn’t the case. According to this study by Mike Belshe[1], “if users double their bandwidth without reducing their Round Trip Time (RTT), the effect on Web browsing will be a minimal improvement (approximately 5%). However, decreasing RTT, regardless of current bandwidth always helps make web browsing faster.” The study demonstrated, by varying bandwidth from 1 Mb/s to 10 Mb/s, that Page Load Time (PLT) saw diminishing returns as bandwidth increased (Table 1).

 

Bandwidth (Mb/s)    Page Load Time via HTTP (ms)
1                   3106
2                   1950
3                   1632
4                   1496
5                   1443
6                   1406
7                   1388
8                   1379
9                   1368
10                  1360

(Table 1)

 

Although the early bandwidth increases deliver a considerable jump, the returns continue to diminish as the pipe gets bigger, until they are almost negligible (e.g., from 9 Mb/s to 10 Mb/s).

 

Bandwidth, it would seem, isn’t the answer to web performance woes! That’s because a bigger pipe between the end-user and the Internet doesn’t mean that a website or web application will actually download much faster. The measure of how well a digital experience performs isn’t just the speed of the download. It’s how quickly the first byte arrives, how many round trips it takes to return the data, how fast the experience renders in the browser (i.e., time to paint), and more—all of which depend on factors well outside the purview of broadband speed. A poorly tuned web origin will still have inherent latency issues whether it’s accessed over a low-bandwidth edge connection or a super-fast 100 Mb/s fiber connection!

 

Sure, you can mitigate some of your Page Load Time latency when users have bigger connections (although if they are already on a high-speed connection, as Mike’s study points out, this will be severely diminished), but you can’t remove the rest of the latency. Consider the following:

 

  1. Your website loads in 10 s on a 2 Mb/s connection.
  2. Using Table 1, users accessing your website through a 10 Mb/s connection would feasibly see your Page Load Time cut to 7 s (a ~30% download improvement).
  3. That still leaves your website at 7 s. Now let’s say your end-users’ connections improve to 20 Mb/s. Extrapolating from Table 1, you might see a further improvement of approximately 5%, feasibly cutting your Page Load Time to ~6.7 s.
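The diminishing returns are easy to verify with a quick back-of-the-envelope sketch in Python. The Page Load Time figures come straight from Table 1 of Belshe's study; everything else here is just arithmetic:

```python
# Page Load Time (ms) at each bandwidth (Mb/s), from Table 1 above.
PLT = {1: 3106, 2: 1950, 3: 1632, 4: 1496, 5: 1443,
       6: 1406, 7: 1388, 8: 1379, 9: 1368, 10: 1360}

def improvement(from_mbps, to_mbps):
    """Fractional reduction in Page Load Time for a bandwidth increase."""
    return (PLT[from_mbps] - PLT[to_mbps]) / PLT[from_mbps]

print(f"1 -> 2 Mb/s:  {improvement(1, 2):.1%}")   # a large early gain
print(f"9 -> 10 Mb/s: {improvement(9, 10):.1%}")  # nearly negligible
```

Doubling bandwidth from 1 to 2 Mb/s cuts Page Load Time by roughly a third, while the jump from 9 to 10 Mb/s buys well under 1%.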

 

That’s still 6.7 s of Page Load Time that cannot be mitigated through faster bandwidth. So even as users access your digital experiences through bigger pipes, your website may still perform subpar. And as we discovered in our own State of the User Experience report, almost 60% of users will abandon a website if it takes longer than five seconds to load! Uh oh. At some point, this latency must be addressed through alternative means, such as reducing the Round Trip Time and reducing the number of Round Trips. In fact, according to Mike’s study, decreasing Round Trip Time (regardless of what you reduce it to) has a steady, scaled effect on reducing Page Load Time (Figure 1).

 

Figure 1: Page Load Time decreases steadily as Round Trip Time is reduced

 

What then should you focus on? What can you do?

 

  1. First, reduce the number of Round Trips it takes to retrieve your website. The easiest solution is to enable persistent connections so that more data can be delivered over fewer connections. Each time an end-user request has to return to your origin (or even a cache), latency is added to the transaction journey.
  2. Second, reduce the Round Trip Time. This is inherently harder, especially if you are using the public Internet—so get off the Internet! Use a CDN so that you can take advantage of objects cached very close to the end user, significantly shortening the middle-mile journey and thereby the Round Trip Time.
  3. Third, tune the server. Make sure your webserver is properly tuned and doing only one thing—serving your website. Many organizations serve websites on generic “all purpose” boxes that also house other applications like databases. The extra computational load slows down your webserver’s ability to do what it needs to do: return content in response to a user’s request.
  4. Fourth, turn on caching. Whether or not you are using a CDN, enable caching on your webserver. This helps serve popular content quicker (especially when it’s served out of memory, e.g., via memcached, rather than from disk).
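To see why the first step matters so much, here's a toy latency model in Python. It's illustrative only—the request count and RTT are made-up numbers, not measurements—but it captures the idea: every request costs one round trip, and every new TCP connection costs an extra round trip for its handshake.

```python
def page_load_latency_ms(num_requests, rtt_ms, persistent=False):
    # Toy model: one RTT per request, plus one RTT of TCP handshake
    # per new connection. Keep-alive pays the handshake cost only once.
    handshakes = 1 if persistent else num_requests
    return (handshakes + num_requests) * rtt_ms

# A page with 50 objects fetched over an 80 ms round trip:
print(page_load_latency_ms(50, 80))        # new connection per request: 8000 ms
print(page_load_latency_ms(50, 80, True))  # one persistent connection: 4080 ms
```

Under these assumptions, keep-alive alone nearly halves the total latency—and no amount of extra bandwidth would have done that.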

 

Of course, there are lots of other ways to improve website experiences such as optimizing images and compressing text files like JavaScript. All of which will help reduce the number of Round Trips and the Round Trip Time. Whatever you decide to do, just don’t sit back and do nothing, hoping that as users get faster connections your website will perform better.

 

Because in the long run…it won’t.

 

[1] Belshe, Mike. “More Bandwidth Doesn’t Matter (Much)” April 8, 2010. https://docs.google.com/a/chromium.org/viewer?a=v&pid=sites&srcid=Y2hyb21pdW0ub3JnfGRldnxneDoxMzcyOWI1N2I4YzI3NzE2

Image courtesy of www.etny.net.

Originally published October 16, 2014

 


 

On Tuesday, October 14, 2014, Google researchers announced the discovery of a vulnerability that affects systems with SSL 3.0 enabled. This vulnerability has been named POODLE (Padding Oracle On Downgraded Legacy Encryption). Details are available at https://www.openssl.org/~bodo/ssl-poodle.pdf

To mitigate exposure to this vulnerability, it is recommended that the use of SSL 3.0 be avoided. SSL 3.0 is an outdated standard, but is still in use in support of legacy applications.
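For application owners who terminate their own TLS connections, one way to apply this recommendation is to refuse SSL 3.0 at the socket layer. Here's a minimal Python sketch (illustrative, not Limelight-specific; note that modern versions of the `ssl` module already exclude SSLv3 by default, and pinning TLS 1.2 as the floor goes further than merely disabling SSL 3.0):

```python
import ssl

# Build a context that will never negotiate SSL 3.0 (the POODLE-vulnerable
# protocol). Setting an explicit minimum version makes the policy auditable.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

The same context can then be passed to any client or server socket wrapper, ensuring a downgrade to SSL 3.0 is impossible regardless of what the peer offers.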

 

Because of the need to support legacy systems, the elimination of SSL 3.0 may not be practical. In that case, customers must weigh the need to support the older standard against the threat of security vulnerability.

 

Limelight strongly encourages discontinuing the use of SSL 3.0 and is actively working with customers to implement the mitigation while minimizing disruption to their end users. Rather than shutting down SSL 3.0 support across the board—which might have the unintended consequence of disrupting customers’ businesses—we have chosen to work with customers to help them mitigate this vulnerability. We have also published a proactive notification informing customers that they continue running SSL 3.0 at their own risk.

 

Regarding the potential use of a workaround known as TLS_FALLBACK_SCSV, Limelight’s position is that this particular mitigation may not fully address the vulnerability. We believe the only acceptable method to fully address it is to discontinue the use of SSL 3.0. More information on TLS_FALLBACK_SCSV is available at https://tools.ietf.org/html/draft-ietf-tls-downgrade-scsv-00

 

If a customer requests, Limelight can block/eliminate SSL 3.0 on their behalf. Again, we will do this FOR customers, not TO them as some other providers have chosen to do.

 

Customers requiring Limelight to mitigate SSL 3.0 on their behalf are encouraged to call or email Limelight support through the normal support process. Because mitigating SSL 3.0 may affect a customer’s end users, it is imperative that the request to shut it off come from someone who is authorized to speak for the customer and who has considered the potential service issues.

 

We have a process in place to perform this mitigation quickly for customers, and believe typical mitigations should be completed within 72 hours of the request.

As always, the security and availability of your services is our highest priority.

 

Please direct further questions to Support@llnw.com

 

Image courtesy of www.dogbreedinfo.com.

jthibeault

From IBC 2014: Wrap-up

Posted by jthibeault Dec 8, 2014

Originally published September 18, 2014

 

Well, another IBC is in the books. Thousands of business cards exchanged, tens of miles walked, and gallons of coffee consumed. But what was it all for? Besides the obvious answer of “drum up more business,” IBC serves a critical function in the media and broadcast business—to reveal, challenge, and validate the trends that are shaping the industry.

 

So what did this year’s show reveal?

 

  • OTT—clearly, OTT has come of age. The floor was dotted with OTT providers like SatLink Communications, Viaccess-Orca, and Zappware, all pitching end-to-end software for carrier- and broadcast-grade OTT solutions. It would seem that OTT is rapidly becoming the “gateway” through which consumers discover, consume, and play back content regardless of the source.
  • Mobilization of Live—content acquisition of live events used to be handled solely by trucks, cables, and satellites. But a number of new technologies such as LiveU and Quicklink (IEC Telecom) provide backpack-based or other mobility camera solutions to enable truckless acquisition of content in the field.
  • Cloud-based Live Production—broadcast has traditionally been carried out at specialized “control rooms.” But as everything in that control room gets connected via IP, there’s no reason why production and broadcasting can’t happen through cloud-based services like Make.tv, which offers a complete virtualized production studio.
  • Complete workflows—the entire broadcasting paradigm is getting turned on its head as a result of intelligent and powerful software that liberates content publishers from their traditional workflows. These cloud-based solutions, like AVID Everywhere and Mediagenix cover everything from content production to distribution.
  • Programmatic Advertising—obviously, monetization is on the tip of everyone’s tongue. Companies like Civolution are taking it to the next level by synchronizing TV-based advertising with media buying opportunities on social networks and the Web.

 

But IBC 2014 wasn’t just about technology trends. There were also fundamental themes woven throughout the show:

 

  • Legitimizing the cloud—not only are cloud-based services springing up to replace incumbent broadcaster technologies and processes, but more and more elements of the broadcaster workflow are becoming connected to the cloud to provide more accessibility and greater flexibility.
  • IP broadcasting—despite the usual prevalence of satellite vendors at the show, IP is making a deeper and more meaningful push into the broadcast industry with the ultimate goal of enabling the delivery of core content (i.e., broadcast television) to any device, anywhere in the world.
  • Broadcasting without boundaries—as I wrote about in a previous post, many of the technology trends were about liberating broadcasting from the traditional, legacy processes, hardware, and technologies so that the acquisition, production, and delivery of content can happen anywhere, not just in a control room.

 

Deep-down, though, IBC 2014 was all about showing that things were starting to work now. The media and broadcast industries are inundated with new technologies all the time. So much so that we forget the bigger problem: making them all work together. Cloud services. Software-based encoding. Workflow solutions. All of them sound good but when they are bright and shiny they are also unproven. And there’s a lot of risk in incorporating unproven technology into production environments. IBC 2014 seemed to forgo the bright-and-shiny for some tried-and-true.

 

That’s it from the show. Doei doei until next year (that means bye-bye in Dutch…I think).

Originally published September 15, 2014

 

Clearly broadcasters have a challenge: get their content to end-users as quickly as possible in the most efficient manner. But that problem is obviously exacerbated by the proliferation of devices. Consumers are no longer chained to their desks, chairs, or couches. So those same broadcasters who have for so long distributed their content via closed, terrestrial networks are now facing the uphill battle of extending their infrastructures, workflows, and processes to push all that content over IP.

 

Which brings me to IBC 2014.

 

With the changing landscape of content consumption, it’s clear that broadcasters must evolve the way they publish. Not only must they deliver content to all those devices, but they must also do so quickly and efficiently. If you listen to Avid, a staple of many broadcaster publishing workflows, it’s because of an “accelerated digitization of the media value chain.” Countless elements of the content publishing workflow are now being offered as cloud-based services, enabling broadcasters and media companies to literally publish from anywhere. In fact, Avid announced Avid Everywhere, “the most fluid end-to-end, distributed media production environment in the industry, a comprehensive ecosystem that encompasses every aspect of the new digital media value chain.” But Avid isn’t alone. AP, Ericsson, Verizon Digital Media Services, Microsoft Azure Media Services, and others have cloud-based platforms for media publishing as well.

 

What’s special about these services is that they promise to liberate content publishing just as multi-device has liberated content consumption. Broadcasters and media organizations are no longer tied to expensive equipment or desktop-based software. Through cloud-based services they are empowered to acquire, edit, and publish content from anywhere they can access the Internet.

 

But it’s only half the story. Creating the best content in the world doesn’t matter if you can’t get it to the viewers on whatever device they want to use, wherever in the world they are. And just as the IBC floor showcased some of these new, innovative services to publish content it also addressed the other side of the workflow: delivery. Quickly and efficiently getting content to end users means being able to convert content into the necessary formats as well as being able to distribute it to multiple end points over an increasingly congested Internet. The Limelight Orchestrate for Media and Broadcasters solution tackles that exact problem: transforming and delivering content in the cloud.

 

It’s broadcasting without boundaries.

 

Together, these new cloud-based media production workflows coupled with delivery workflow services replicate what were once closed software/hardware ecosystems for content creation, publishing, and distribution. This enables content publishers to not only create what they want and where they want, but also to distribute to consumers so they can watch when they want, how they want, and where they want.

 

Of course, the system still isn’t perfect. There remains a tremendous amount of integration that needs to happen between these two different parts of the workflow so that the entire process is seamless for the content publisher. But IBC 2014 showed us a glimmer of a future in which content publishing becomes fluid, when nothing gets in the way of getting great content to end consumers.

Originally published August 29, 2014

 

Wake up, folks. ZRT is here.

 

Facebook Zero. Wikipedia Zero. Google Free Zone. All these initiatives share one thing in common—their traffic doesn’t cause mobile subscribers to rack up usage charges against their data plans.

 

But it was the most recent entrant into the zero-rating game, T-Mobile, that truly demonstrated how powerful and real zero-rating has become as a marketing tool. Earlier this year, T-Mobile launched its “Music Freedom” platform, which enables subscribers to listen to as much audio as they want from their favorite audio streaming provider without burning through their monthly T-Mobile data allowance. At launch, only eight providers[1] were supported, although T-Mobile is now crowd-sourcing requests for future services to be added.

 

It is a great marketing campaign, targeting a specific, high-profit user segment. With its launch, T-Mobile opened the door to leveraging zero-rated traffic as a marketing vehicle, and that is some powerful mojo.

 

Why audio? Because it is easy. Even high-quality audio at 128 Kbps consumes relatively little bandwidth, so even a large audience of simultaneous listeners won’t strain the network. But what’s next? A “Video Freedom” platform? Who knows…

 

Regardless of how ZRT plays out as a feature, it should be a major wake-up call to every content provider, because delivering zero-rated traffic is different. The key is that this kind of traffic is tied into the carrier’s billing system. The carrier “whitelists” one or more IP addresses, allowing the content from those addresses to be zero-rated when it’s consumed—meaning there’s no charge against the user’s data allotment.

 

And when a content owner’s traffic is zero-rated, the impact can be dramatic. Suddenly, there are a lot more people wanting to consume that content. And, all that expanded traffic may well require the use of a CDN to deliver the scale and global reach necessary to satisfy user demands.

 

If you are considering ZRT, you will want to engage a CDN anyway. CDNs can provide the reserved IPs that the carrier needs to make ZRT work with their billing system. Be careful, though, because not all CDNs are created equal when it comes to reserved IPs. Most can only deliver them from a few specific points within their network. That kind of restriction forces a tradeoff between performance and capacity that obviates the benefit of using a CDN in the first place.

 

But this isn’t the case with Limelight Networks. Our network was architected to allow for reserved IPs at scale. That means when you reserve an IP address in our network, it is reserved in every POP, everywhere in the world. When you are using a CDN provider that can provide reserved, virtual IPs at scale, you can deliver zero rated traffic through virtually any mobile carrier with which you can sign a contract.

 

In addition to requiring reserved, dedicated IPs for whitelisting traffic, some zero-rated traffic also needs to be delivered over Secure Sockets Layer (SSL). This is especially true for content such as device updates but it is applicable for any content owner that fears their content may be ripped off in transit.

 

The issue here is similar to the dedicated-IP issue. Many CDNs serve HTTPS (SSL) traffic only out of separate delivery pools that are significantly smaller and less geographically dispersed than their HTTP equivalents. Again, this forces unnecessary capacity and performance tradeoffs. But it’s not the case with Limelight Networks.

 

In early 2014, Limelight combined delivery pools enabling us to deliver secure HTTPS traffic at the same global scale as we traditionally deliver HTTP.

Although you may not be running out to your local mobile carrier to strike a zero-rated traffic deal tomorrow, T-Mobile’s initiative has set the bar for content owners. If you are considering ZRT, or have other needs for reserved IPs at scale, perhaps the best choice is to partner with a global CDN that’s already future proof. A CDN that doesn’t force you to make tradeoffs between scale, availability, and performance just to get your content delivered the way you need it. A CDN like Limelight Networks.

 

[1] http://www.t-mobile.com/offer/free-music-streaming.html

Image courtesy of e27.co.

Originally published June 30, 2014

 

GOOOOOAAAAALLLLLLLLL!!!!

 


 

It’s a battle cry that’s resonating not just from television sets around the world but from smartphones and tablets as well. More and more people are turning online to get their World Cup fix while they are standing in line, waiting at a traffic light, or huddling over their laptop at their desk. What started with the 2012 Summer Olympics has only gained momentum—more people watching more video online. The World Cup, though, is blowing the Olympics out of the proverbial water! A global effort of multiple broadcasters and CDNs is delivering hundreds of terabytes of data each match to people around the world.

 

So what’s really happening? As of the posting of this blog, our network has seen massive utilization, peaking at well over 3.6 Tb/s (terabits per second) during match play. Watching from iOS and Android devices and game consoles, hundreds of thousands of concurrent users are tuning in online. In fact, during the United States vs. Germany final qualifying game, over 750,000 users connected at the same time for a flawless game-time experience. And one of our partners single-handedly hit close to 1 Tb/s as the U.S. advanced into the round of 16!

 

Of course, this doesn’t just happen by itself. In fact, when you pull back the curtain, what you find is a massive engine of people and technology dedicated, match after match, to ensuring the best possible end-user experience. So what does it take, exactly, to deliver a World Cup match online? Check out the numbers:

 

  • Tens of thousands of servers to accept connections from end-user devices and deliver the video to them all around the globe
  • Hundreds of man-hours of dedicated engineering and support resources during each match (these are literally people sitting behind monitors in our network operations center)
  • Software to capture analytics and provide real-time feedback on who’s watching what, when, where, and how.
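
To give a flavor of what that analytics software computes, a headline number like the 750,000 concurrent viewers above can be derived from session logs with a simple sweep-line pass over join and leave events. This is a hypothetical sketch with invented sample data, not our actual analytics pipeline.

```python
# Illustrative sketch: peak concurrent viewers from (start, end) session
# timestamps, computed with a sweep line. Sample data is invented.
def peak_concurrency(sessions):
    """sessions: iterable of (start, end) timestamps.
    Returns the maximum number of sessions active at any one moment."""
    events = []
    for start, end in sessions:
        events.append((start, 1))   # viewer joins
        events.append((end, -1))    # viewer leaves
    # Sort by time; at equal timestamps, process departures (-1) first
    events.sort(key=lambda e: (e[0], e[1]))
    peak = current = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

sessions = [(0, 90), (10, 95), (20, 45), (50, 120)]
print(peak_concurrency(sessions))  # 3
```

The same pass, sharded across servers and aggregated, scales to millions of sessions, which is why this class of real-time counting shows up in every live-event dashboard.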

 

What makes it all possible? That would be the Limelight network: a massive, global private network supporting over 11 Tb/s of egress capacity across 80+ locations in over 40 countries. It’s the private nature of the network that sets it apart from competitors and enables us to deliver flawlessly to, for example, 750,000 concurrent users. No Internet congestion with which to contend!

 

Of course, we are still only at the beginning of the World Cup. With the round of 16 just underway, the elimination matches promise to yield even more traffic and concurrent users. And that’s the really telling story behind this year’s World Cup: it’s a game changer for the way we consume media, a snowball rolling downhill that promises to transform the landscape of rich media. But it also signals something else: the need for more capacity, more software, and more expertise to handle the World Cup of the future, something we are tirelessly focused on providing to customers around the world.

Originally published May 22, 2014



This year’s one-day show about content delivery and performance was all about the end user. Quality of Service (QoS) and Quality of Experience (QoE) took center stage, as real-user monitoring (RUM) and transparent caching seemed to be on everybody’s lips.

 

When it came to RUM, there was no better presentation than the session featuring Dan Rayburn (EVP, Streaming Media) and Pete Mastin (Market Strategy and Product Evangelist, Cedexis). This presentation on best practices in multi-CDN delivery stressed the value of RUM data in improving quality of service. For companies seeking to segment traffic based on end-user performance, the Cedexis Radar community provides crowd-sourced data from 350 million global end users per day and is considered one of the most accurate sources of CDN performance data on the market.

 

Our own performance benchmarking efforts validate these conclusions. After evaluating our dynamic content acceleration with internal and external synthetic testing (http://resources.limelight.com/rs/limelight/images/ESGLabValidation-LLNWOrchesteratePerformance.pdf), we looked to Cedexis Radar data for validation. The results confirmed that our performance exceeded the competition by an average of 15% (http://blog.limelight.com/2014/02/real-user-data-reveals-startling-results-on-the-performance-of-dynamic-site-acceleration-services-limelight-is-1/).

 

The takeaway here? RUM data provides more transparency to CDN customers that want to optimize performance based on real-world KPIs beyond raw network speed. For more on RUM, read our recent post “Real Users: A Common Web Performance Blind Spot.”
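
To illustrate the idea (this is a naive sketch, not Cedexis’s actual scoring algorithm), a RUM-driven selector might route each request to the CDN with the lowest median latency across recent real-user samples; medians resist the single-outlier samples that distort raw averages. The CDN names and numbers below are invented.

```python
# Toy RUM-driven multi-CDN selection: choose the CDN with the best
# median latency from recent real-user samples (all data invented).
from statistics import median

def pick_cdn(rum_samples):
    """rum_samples: dict mapping CDN name -> list of latency samples (ms).
    Returns the CDN name with the lowest median latency."""
    return min(rum_samples, key=lambda cdn: median(rum_samples[cdn]))

samples = {
    "cdn-a": [48, 52, 47, 210, 51],  # one 210 ms outlier; median stays ~51
    "cdn-b": [60, 58, 62, 59, 61],
}
print(pick_cdn(samples))  # cdn-a
```

A mean-based selector would have penalized cdn-a for its single slow sample, which is precisely the kind of blind spot real-user data is meant to expose.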

 

As for transparent caching, it all comes down to QoS and QoE. In the quest to optimize Quality of Experience, content providers increasingly depend on transparent caching to accelerate streaming and download of their popular content by placing it within user access networks. That means they might put a transparent caching box in Time Warner’s network, for example. On the provider side, Qwilt and PeerApp presented on the benefits of transparent caching. And both Netflix and Google promoted their caches being placed within operator networks to give users a better experience. Of course, this makes total sense—the closer you can get the content to the end user, the better the experience should be. And if you got content any closer to end users than transparent caching, you’d probably put a server in their lap!
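
To illustrate the principle (a toy sketch, not how Qwilt’s or PeerApp’s appliances actually work), a transparent cache behaves like an in-network LRU store that falls back to origin on a miss; the function names and simulated origin below are hypothetical.

```python
# Toy transparent cache: serve popular objects from an in-network LRU
# store, falling back to origin on a miss. The origin fetch is simulated;
# a real transparent cache intercepts traffic at the network layer.
from collections import OrderedDict

class TransparentCache:
    def __init__(self, capacity, fetch_from_origin):
        self.capacity = capacity
        self.fetch = fetch_from_origin
        self.store = OrderedDict()
        self.hits = self.misses = 0

    def get(self, url):
        if url in self.store:
            self.hits += 1
            self.store.move_to_end(url)     # mark as recently used
            return self.store[url]
        self.misses += 1
        body = self.fetch(url)              # cache miss: go to origin
        self.store[url] = body
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return body

cache = TransparentCache(capacity=2, fetch_from_origin=lambda u: f"<video {u}>")
cache.get("/match1.ts"); cache.get("/match1.ts"); cache.get("/match2.ts")
print(cache.hits, cache.misses)  # 1 2
```

The operator’s win is visible even in the toy: every hit is a request that never crossed the peering link, which is where the capex savings discussed below come from.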

 

Google and Netflix are large traffic pushers today, but we don’t know what the world will look like tomorrow, let alone in two years. (Just look at how fast Twitch emerged as a major player in live streaming.) Other content providers are looking to content delivery networks (CDNs) to implement the transparent caching strategies that will lend a competitive advantage as demand for their content grows.

 

Regardless of the content source, transparent caching was generally regarded by show attendees as a triple win. Network operators increase capacity with drastically less capex, and can control the devices placed in their network for better design, monitoring, and management practices. Content providers control the quality of experience they provide over the short haul. End users, of course, get what they want: better video streams.

 

What did you hear at the show? How do you think these strategies impact QoS and QoE? Drop me a note at jt@llnw.com, comment below, or catch me on Twitter @_jasonthibeault.