
This week kicks off the most exciting time of the year for college basketball fans, with the 2017 NCAA men’s Division I tournament set to swing into action. As in previous years, broadcasters are stepping up with innovative new viewing options, including the ability to tune into more games than ever. These options, however, put new demands on video infrastructures.

 

Following in the path of last year’s tournament coverage, the popular March Madness app delivers fan-friendly features, including free streaming of every game. CBS and Turner will provide live streaming access across a record 15 different platforms, including Fire, Roku, Xbox, Apple TV, and Amazon’s Alexa. For a complete guide to game streaming options, read this c|net article. Access to sports and other live events is a significant concern for Millennial males: 20 percent say they would not cut the cord until more live content becomes available online, compared to just eight percent of the population as a whole. An important addition to online viewing options for sports is ESPN’s streaming support. On the broadcast TV side, TBS, CBS, TNT, and truTV provide exclusive live, national coverage of all 67 NCAA Tournament games across the four networks. All these options mean viewers can watch more games than in any previous March Madness. Instead of one game every four hours, you can even catch smaller, lesser-known college teams on one of the many streaming options.

 

Alexa, what’s the score of the UConn game?

Amazon Alexa will answer questions related to scores and games. Alexa users also have the option of listening to the radio broadcasts of the games on the device. Westwood One, the largest American radio network, holds radio play-by-play rights for the men’s NCAA tournament, and will broadcast all 67 games on more than 500 affiliated stations, SiriusXM satellite, streaming online, and on mobile platforms.

 

All of these coverage options across media platforms follow a pattern established in the earliest days of radio: sports broadcasting is traditionally at the forefront of media technology advancement, with high-profile sports events being the first programming to use new capabilities. With March Madness, we’re seeing coverage expand to more streaming channels than ever before, including the Alexa voice assistant. As 2017 rolls out, expect other new media formats to be used to give sports audiences a “just like being there” experience. Content delivery networks (CDNs) are poised and ready to deliver the video and audio streams, so bring it on. Let the games begin.

Reduce the Cost and Complexity of Your Content Delivery Workflows

 

We’re in an on-demand world. Video on demand, streaming audio, web surfing, online shopping, gaming and more – today’s digital audience wants the world on their schedule. They use a wide variety of laptops, tablets, phones, and TV-connected devices, sometimes several at once. Tens of billions of hours of content per month are consumed globally. And consumers want a great experience – instant response, high quality, no glitches, delays, or interruptions.

 

Collectively, it’s a massive on-demand beast that must be fed.


If you’re a company delivering content, what does it mean to feed that on-demand beast?


The Delivery Network

You want to create a great experience for your audience. You’ve probably learned that the open Internet fails to deliver at scale. As a result, you may already use a Content Delivery Network (CDN) to accelerate delivery to your audience. One of the important advantages of a CDN is that it uses edge cache to temporarily store many copies of popular content at many sites around the edges of the network.


Feeding the Delivery Network

But what if someone requests on-demand content that isn’t stored in the edge cache? In fact, this happens millions of times per day. Edge cache generally contains popular on-demand content – hit movies and videos, new software downloads, new hit songs, popular items on web stores, images and animations from top web pages, and so on.


In practice, the majority of a typical company’s content library isn’t stored in the edge cache. It’s stored on one or more file servers or video servers – origin servers. When a user requests an asset that isn’t in the edge cache, the network needs to retrieve it from origin storage.
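The cache-then-origin flow described above can be sketched in a few lines of Python. This is a minimal illustration, not Limelight's actual software; the class, TTL value, and names are all hypothetical:

```python
import time

class DeliveryNode:
    """Minimal sketch of an edge node: serve from cache when possible,
    otherwise fetch from origin storage and cache the result."""

    def __init__(self, fetch_from_origin, ttl_seconds=300):
        self.fetch_from_origin = fetch_from_origin  # callable: url -> bytes
        self.ttl = ttl_seconds
        self._cache = {}  # url -> (content, stored_at)

    def get(self, url):
        entry = self._cache.get(url)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]                        # cache hit: served from the edge
        content = self.fetch_from_origin(url)      # cache miss: go to origin
        self._cache[url] = (content, time.time())  # keep a copy at the edge
        return content
```

The first request for an asset pays the full round trip to origin storage; subsequent requests within the TTL are served from the edge, which is why the speed of that first origin fetch matters so much.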


The DINAS of Feeding the Delivery Network

How quickly and reliably can you fetch content from origin storage? It depends. Here are some factors that can make a dramatic difference in user experience – I call them the DINAS:

 

1. Distance from the user 
The earth is a big place. If your origin storage is halfway around the world from the user, the one-way delay could be 300 milliseconds or more, and the user pays it twice: once for the request and once for the response. That means the user waits over half a second just to retrieve content from origin. You can deliver a much better response time if you have multiple origins in multiple regions, located close to where your users are.
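The arithmetic behind that half-second is simple enough to show directly. The numbers below are illustrative, not measurements:

```python
# Waiting on a distant origin: the request travels one way and the
# response travels back, so the user pays the one-way delay twice.
one_way_delay_ms = 300                  # origin halfway around the world
round_trip_ms = 2 * one_way_delay_ms    # 600 ms before the first byte arrives

# A regional origin close to the user shrinks the same round trip.
regional_one_way_ms = 20
regional_round_trip_ms = 2 * regional_one_way_ms  # 40 ms

print(round_trip_ms, regional_round_trip_ms)
```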


2. Integration with the delivery network
Traffic flow is hampered if it has to traverse different networks, locations, and links between the origin and the delivery network. You can streamline and accelerate traffic flow if the origin storage is tightly integrated with the delivery network, especially if it’s colocated in the same sites as the edge devices.

 

3. Network from the origin to the user
The open internet has many experience-impacting obstacles. It’s subject to link congestion, node congestion, rerouting, timeouts, retransmissions, and other issues. And of course, the farther your content must travel over the open Internet, the more issues it can face. The good news is you can bypass this with a high-bandwidth private fiber network.

 

4. Availability
There’s simply no way to deliver content if it isn’t available in the first place. Outages can be caused by hard drive failures, network device issues, local power failures, regional network issues, general network congestion, and more. You can ensure availability with redundant origin servers in multiple sites. That way, even if there are issues in one origin site or one network region, the network can fetch the asset from somewhere else.

 

5. Software intelligence
If you have intelligent software that can automatically choose the fastest origin for each user session, you’ll consistently deliver the best possible response.
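Combining points 4 and 5, the core decision such software makes can be sketched as "pick the lowest-latency origin that is currently healthy." The function and data shapes below are illustrative, not a real product API:

```python
def choose_origin(origins, rtt_ms, healthy):
    """Pick the lowest-latency healthy origin for a user session.

    origins: list of origin site names
    rtt_ms:  dict mapping origin -> measured round-trip time in ms
    healthy: dict mapping origin -> current availability (bool)
    """
    candidates = [o for o in origins if healthy.get(o)]
    if not candidates:
        raise RuntimeError("no healthy origin available")
    # Unmeasured origins sort last via infinity.
    return min(candidates, key=lambda o: rtt_ms.get(o, float("inf")))
```

If the fastest site goes down, the same logic automatically falls through to the next-fastest healthy origin, which is exactly the availability benefit of redundant origins in multiple regions.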


Feeding the Feeder

Uploading content into your origin storage, and replicating it across multiple regions, can consume significant time, effort, and money. If you have to perform numerous manual operations, develop custom code, or spend time on asset tracking and management, you’ll burn through your budget and burn out your people.


Extra fees are another way to burn through budgets. Low cost storage-at-rest can be tempting, but extra fees can dwarf the cost of basic storage. Depending on your vendor, extra hidden costs can include fees for uploading content, copying it to multiple sites, moving it, retrieving it, transferring it to your delivery network, even requesting a directory list.

 

The Bottom Line

Whether you’re delivering video-on-demand, over-the-top (OTT) video, file distribution, gaming, web acceleration, e-commerce, or other content, your origin storage is more important than you may realize.


The optimal choice for origin storage is likely to be designed specifically for content delivery. Flexible workflow will make your job easier. Automated replication to multiple sites ensures delivery wherever your users are located. High speed design and integration with the delivery network will give your users fast response and consistently high quality. Intelligent software will serve content from the fastest site so you delight your audience on a global scale. And of course, you’ll want to deliver great user experience on budget, without hidden fees.


And that’s how you feed the on-demand beast.

This is the third blog in the ARC Light series showcasing the capabilities of Computed Edge Policies. The introduction blog presented an overview of ARC Light and included a few examples of use cases built around geolocation data. The second blog covered Origin Selection, and this blog will discuss user request modification.

 

Multinational companies inevitably have to solve for language support on their front end. There are many facets to the issue, from translating content to the UI/UX work needed to accommodate the translated content. It’s a project that requires a lot of time and money. How will the front end know which language to serve to the user? It may be an easy question to answer for existing users, but for new visitors it's a different story.

 

Language Selection by Geographical Location

 

Language selection becomes easy when out-of-the-box CDN functionality is extended by ARC Light. The Limelight CDN provides geographical location information on every request. ARC Light extends the ability of the CDN to modify request headers in real time. A query string or header information can be automatically added to the request so the front end knows which language to display to the user. The CDN maintains the Geo database for actionable data. ARC Light manages the compute layer for modifying the request.
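ARC Light's actual policy format isn't shown here, so the following is a hedged Python sketch of the underlying idea: take the country code the CDN's geo lookup attached to the request, and tag the request with a language before it reaches the front end. The mapping, field names, and header name are invented for illustration:

```python
# Hypothetical mapping from the CDN's geo country code to a language tag.
COUNTRY_TO_LANG = {"CN": "zh-CN", "FR": "fr-FR", "DE": "de-DE"}

def add_language_hint(request):
    """Attach a language header and query-string parameter derived from
    the geo country code the CDN already added to the request."""
    lang = COUNTRY_TO_LANG.get(request["geo_country"], "en-US")
    request["headers"]["X-Preferred-Language"] = lang
    separator = "&" if "?" in request["url"] else "?"
    request["url"] += separator + "lang=" + lang
    return request
```

The front end then reads the `lang` parameter (or header) and serves the matching localized site, with no geo lookup of its own.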

 

Advanced Request Configuration

 

This is a basic example of request modification using ARC Light. ARC Light can evaluate any data available within the request header, such as URLs, query strings, hostnames, cookies, IP addresses, or custom header data. More complex cases can then be solved within the Advanced Request Configuration by modifying any portion of the header using predefined business logic. Simple, quick compute at the edge.

 

More to Come

 

Over the next couple of weeks, several more blogs will be posted highlighting a specific ARC Light use case. Return to this site to read about more ARC Light capabilities.

Live video streaming offers content distributors, broadcasters, and other organisations a golden opportunity to excite and actively engage with their audiences. With live video streaming gaining steam thanks to the likes of Facebook and Twitter, integrating it into your strategy is key to staying relevant and remaining competitive. However, audience numbers and behaviour are notoriously difficult to predict, and the diverse range of devices makes it more challenging than ever to deliver the glitch-free experience that today’s consumers expect. If traffic levels exceed your servers’ capacity, you risk providing a sub-optimal live streaming experience.

 

On the specific challenge of implementing live streaming video, Steve Miller-Jones, Senior Director of Product Management at Limelight Networks, states: “Broadcasters face increased pressure to make, deliver, and monetise live video to keep up with new competition from social media giants and microblogging sites. More consumers are cutting the cord and that is our new reality. For content distributors to stay competitive, they must prioritize their ability to instantly create a TV broadcast-quality experience for all users. Overcoming internet traffic obstacles for uninterrupted viewing is one way of tackling this challenge. Utilising a densely-architected, global content delivery network that offers scalability, reliability, and can adapt to new business models, new markets, and increasingly mobile audiences will guarantee your live video strategy is a success.”

 

We will be at BVE 2017 at London’s Excel Centre (28th February – 2nd March). You can visit Limelight at stand K18. BVE is co-located with the Streaming Video Forum, where Limelight’s Steve Miller-Jones will be leading a session on ‘A broadcaster’s guide to optimising live streaming’ on Wednesday 1st March. The talk will offer tips and best practices for building out a live streaming platform to deliver a high-quality viewing experience, covering topics from transcoding and transmuxing to analytics and protecting the live stream.

This is the second blog in the ARC Light series showcasing the capabilities of Computed Edge Policies. The introduction blog presented an overview of ARC Light and included a few examples of use cases built around geolocation data.

 

We often receive requests from customers who need to implement custom business logic for selecting the appropriate origin. Origin selection for a subset of traffic based on predefined criteria is a standard use case, typically fulfilled by out-of-the-box configuration options within the CDN. In some cases, however, we encounter a need for origin selection based on a complex set of rules. ARC Light extends the standard configuration options of the CDN for these cases.

 

Device Updates by Geographical Location

 

A use case we recently encountered for a mobile device manufacturer required directing traffic to alternate origins for software updates depending on the user’s software version and geographical location. The manufacturer maintains origins in the Americas, Europe, and APAC where some are tuned for delivering larger software updates and others are tuned for delivering smaller updates.  

 

ARC Light solves the problem by executing complex business logic at the edge. The Advanced Request Configuration detects the user’s software version by evaluating multiple pieces of data included in the request header, and the CDN automatically generates geographical data for the user upon request. If the request is not served directly from cache, ARC Light combines the version information and geographical data to determine the appropriate origin using the predefined business logic.
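A sketch of what that routing decision might look like. The hostnames, regions, and the "more than one version behind means a large update" rule are all invented for illustration; the real policy logic belongs to the customer:

```python
# Hypothetical origin fleet: per region, one origin tuned for large
# payloads and one tuned for many small files.
ORIGINS = {
    ("americas", "large"): "us-large.origin.example.com",
    ("americas", "small"): "us-small.origin.example.com",
    ("europe",   "large"): "eu-large.origin.example.com",
    ("europe",   "small"): "eu-small.origin.example.com",
    ("apac",     "large"): "ap-large.origin.example.com",
    ("apac",     "small"): "ap-small.origin.example.com",
}

def select_origin(region, current_version, latest_version):
    """Devices far behind need the big cumulative update, so route them
    to the large-payload origin; devices one step behind get a delta."""
    tier = "large" if latest_version - current_version > 1 else "small"
    return ORIGINS[(region, tier)]
```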

 

Speedy Deployment of Powerful Configurations

 

ARC Light provides the flexibility to utilize actionable data included in the header such as URLs, Cookies, Query Strings, and Custom tags to apply logic in real time. Configurations are versatile yet quickly built and deployed across the network to provide a powerful tool for origin selection.

 

More to Come

 

In the next couple of weeks several more blogs will be posted highlighting a specific ARC Light use case. Return to this site to read about more ARC Light capabilities.

Directing users to the websites, files, games, images, and other content they are trying to reach as fast as possible is vital to providing the best user experience. Content Delivery Networks (CDNs) have been the go-to solution for rapidly connecting users to content for over a decade. Complex data manipulation of user queries by website and application servers is another part of the decision-making process that completes the user’s connection to specific content. Burdening web or application servers with this compute functionality can slow down responses to users.

 

Limelight ARC Light is a new Limelight CDN capability that speeds up the connection of users to content and tailors delivery to specific user needs. ARC Light allows the CDN to use data in user requests to make decisions, using computed edge policies to facilitate real-time modifications of client requests and origin responses. Business benefits include speeding users to the correct websites for faster transaction completion, enhancing content access security, improving users’ quality of experience, and enhancing brand reputation. These benefits and capabilities apply to websites and to any type of content. The complex data manipulations required to perform these actions are offloaded from your application infrastructure, reducing latency in the decision process, increasing efficiency, and lowering the complexity of managing the application or website. The Limelight Advanced Services Team powers the rapid development and deployment of ARC Light computed edge policies, reducing the time to deploy complex business logic at the edge from months to days.

 

To better understand the capabilities of ARC Light and the business benefits derived from them, some example use cases include:

  •      Origin Selection: By Geographical Location - A mobile device manufacturer has a software update to push to all devices via a CDN. To deliver the update, the CDN captures the Geo information and passes it to ARC Light, which sends devices located in China to Chinese origins, devices located in North America to North American origins, and so on.
  •      Redirection: Content By Geography - A publishing company has content with licensing restrictions based on the location of users: some content can only be accessed by users from the US or Australia. When users request content, ARC Light uses Geo lookup data to verify the user's location. Confirmed US-based users are granted access to US content, users from Australia are redirected to the library of content available to Australians, and visitors outside the US and Australia are blocked from access.
  •      Request Modification: Language By Geography - An e-commerce company publishes its website in multiple languages. When a visitor from China navigates to the e-commerce site, the geolocation of the visitor is detected, and ARC Light executes a policy to add a query string to indicate the Chinese language version of the site should be utilized. 
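The redirection case above reduces to a small decision function at the edge. The paths, country codes, and return shape here are illustrative, not ARC Light's actual policy syntax:

```python
def route_by_country(country):
    """Licensing example: US users get the US library, Australian users
    are redirected to theirs, and everyone else is blocked (HTTP 403)."""
    if country == "US":
        return ("serve", "/us/library")
    if country == "AU":
        return ("redirect", "/au/library")
    return ("block", 403)
```

Because the decision runs at the edge, a blocked or redirected request never reaches the origin at all.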

 

More to Come

These are a small sample of the Edge Policy Actions ARC Light can perform to speed users to content. Over the following several weeks, a series of blog posts, each specific to a different ARC Light use case, will showcase the variety of tasks that can be performed: improving user experience, enhancing brand reputation, increasing return website visits, and driving increased revenue. See you back here shortly for the next in the ARC Light series of blogs.

There are multiple benefits to knowing where web visitors are coming from. For an ecommerce web site, knowing where potential customers are allows pre-population of country codes on forms, displaying different languages, and presenting region-specific content. OTT video services may have licensed content with limits on where it can be viewed, and geolocation information provides a way to enforce the restrictions. For those of you not familiar with geolocation, it is the matching of an IP address to a geographical location. This third blog in the series of security technology blogs (the previous two covered DDoS Attack Interception and WAF) will focus on geolocation as it pertains to protecting and accessing content.

 

Where can you get an IP-based geolocation database?

 

There are several commercially available geolocation databases. Ip2location, MaxMind, and IPligence offer fee-based databases that can be easily integrated into web applications. Most geolocation database vendors offer APIs and example code in multiple programming languages for retrieving geolocation data from the database.

 

There are also freely available geolocation databases. Some vendors offering commercial geolocation databases also offer a Lite or Community edition that provides IP-to-Country mappings. Ip2Country.net and Webhosting.info offer free IP-to-Country databases that can be integrated into web applications.
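Whichever vendor you choose, an IP-to-country database boils down to sorted address ranges, so a lookup is a binary search. Here is a self-contained sketch using a tiny made-up two-row table; real databases ship millions of rows in this shape:

```python
import bisect
import ipaddress

# An IP-to-country table is a sorted list of (range_start, range_end,
# country) rows, with addresses stored as integers. Sample rows only.
RANGES = [
    (int(ipaddress.ip_address("1.0.0.0")), int(ipaddress.ip_address("1.0.0.255")), "AU"),
    (int(ipaddress.ip_address("8.8.8.0")), int(ipaddress.ip_address("8.8.8.255")), "US"),
]
STARTS = [r[0] for r in RANGES]

def country_for(ip):
    """Binary-search the range table for the row covering `ip`;
    return its country code, or None if no range matches."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(STARTS, n) - 1
    if i >= 0 and RANGES[i][0] <= n <= RANGES[i][1]:
        return RANGES[i][2]
    return None
```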

 

 

How accurate is IP-based geolocation?

The accuracy of a geolocation database varies depending on which one you use. According to IPlocation.net, some vendors of IP-to-country databases claim 98% to 99% accuracy, although a typical Ip2Country database is more like 95% accurate. For IP-to-region or IP-to-city lookups, accuracy ranges anywhere from 50% to 75% if neighboring cities are included. Considering that there is no official source of IP-to-region information, 50+% accuracy is pretty good.

 

How Content Delivery Networks use Geolocation data

The primary use of geolocation data by CDNs is to control access to website origins based on user location. In the case of the Limelight CDN, our integrated geolocation database is accurate down to postal codes and latitude/longitude. For example, a media company live streaming a sports event may have regional blackout restrictions in its licensing agreement. Using geolocation data, the CDN will block access to the streams for users in the blackout region. Most licensed video-on-demand content comes with regional viewing restrictions, so delivery CDNs use the same method to enforce those agreements. An important additional service CDNs perform as part of enforcing regional viewing restrictions is denying access when an end user is utilizing a known anonymizer service, since an anonymizer allows a user to disguise their location to gain access to blacked-out content.
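The enforcement decision itself is small; the hard part is the accuracy of the geo and anonymizer data feeding it. A sketch with illustrative names:

```python
def allow_stream(user_region, blackout_regions, is_known_anonymizer):
    """Decide whether a viewer may access a geo-restricted stream.

    Known anonymizers are denied outright, since they exist precisely
    to defeat the location check; then the blackout list is applied.
    """
    if is_known_anonymizer:
        return False
    return user_region not in blackout_regions
```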

 

More to Come

The next blog in this series will cover securing content in motion with HTTPS, and there will be more to come on security-related events as they occur. See you here next week!


Raised on Xbox and PlayStation, Millennials have high demands when it comes to entertainment. The gambling industry has historically struggled to reach this demographic, with traditional casino games no longer fulfilling their demand for easily accessible, fast-paced, dynamic experiences. However, the rise of digital gambling experiences has transformed the overall dynamics of the global gambling industry. A recent report by the UK Gambling Commission highlighted a growing interest in gambling among Millennials, revealing that 17.5% of 18 to 34 year olds participated in a form of gambling in 2014, compared to 10.6% in 2008.

 

The accelerated expansion of online and mobile gambling can be attributed to the rapid uptake in smartphone usage. In the UK, 91% of Millennials own an Android, iPhone, or iPad device, representing a huge market for the gambling industry. But to provide the best possible user experience, casinos cannot underestimate the importance of a densely-architected content delivery network.

 

 

Commenting on this, Mike Milligan, Senior Director, Product and Solution Marketing, Limelight Networks, recommends: “Online and mobile casinos looking to draw in younger customers must embrace disruptive technologies to offer the engaging content that users want, but they must also guarantee that the customer’s user experience isn’t compromised by second-rate digital infrastructure. A high-performing CDN can guarantee that your website delivers on users’ expectations through fast downloads, superior streaming video delivery, cloud leverage and, of course, the highest-level site security. Gambling providers need to consider carefully these requirements as they look to their millennial outreach efforts and to the future.”

 

Limelight will be at ICE 7-9 Feb at London’s Excel Centre where they will be talking about innovation within the gambling industry, as well as the gaming industry as a whole. Steve Miller-Jones, Senior Director of Product Management, Limelight Networks, will be taking part in the panel ‘Cross-Platform & Multi-Channel Gaming’ that will be looking at how the gaming industry can better understand how to design new experiences for new channels. 

A highly anticipated halftime performance from Lady Gaga, followed by the largest point-deficit comeback and the only overtime game in Super Bowl history, drew 111.3 million viewers and a 48.8 rating – the third-highest metered rating in the game’s history, according to FoxSports. Adding in 600K viewers on Fox Deportes and 1.7 million streams on Fox Sports Go raises the total to 113.6 million. For Fox, the game was the network’s highest metered market rating ever.

 

The Super Bowl first exceeded the 100 million viewer mark in 2010, and since then there has been steady growth with minor fluctuations, as shown by data in Variety from the past several Super Bowls:

 

  •       2017 – 48.8 overnight rating with a 70 share and 111.3 million viewers (Lady Gaga’s halftime performance generated a 50 rating)
  •       2016 – 49 overnight rating with a 73 share and 111.9 million viewers
  •       2015 – 49.7 overnight rating and 114.5 million viewers (All-time most watched), spiking to 120.8 million in the fourth quarter as the score tightened.
  •       2010 – 106.5 million viewers

 

It’s important to note that the roughly 3% drop in this year’s viewership compared to 2015 is actually encouraging, as NFL ratings over the same period dropped 10%.

 

Overall, about 60% of game traffic we delivered in the US was to the East coast. The top regions included Chicago, greater NY/New England, Washington DC, Atlanta, and Los Angeles.

 

More to Come

Stay tuned – as more detailed regional data becomes available over the next day or two, I’ll post new insights on viewership of the game.

Mike Milligan and I just finished presenting the live webinar, The State of Online Video: The Consumer Is in Control, where we revealed key findings from the report and discussed strategies organizations can implement to deliver great online video experiences to their audiences.

We received so many great questions from webinar attendees that we didn’t have time to address them all, so we are continuing the conversation right here on Limelight Connect!  To join in on the conversation, send us a question by using the “Comments” field for this post, and we’ll reply right away.

And in case you missed the webinar or want to see it again, it is available to watch on-demand. View now.

Significant changes have taken place since NFL Media began live coverage of Super Bowl festivities in 2004.  At the time there was a mere 14 hours of programming for the week. This year’s Super Bowl week features more than 80 hours of live coverage on NFL Network, plus SuperBowl.com, NFL Mobile, and social media providing an unlimited variety of content for fans around the world. Large sports events continue to be where new video technology is showcased, and the tradition continues with this year’s game.


360 Replay Technology


Intel’s “Be the Player” feature is probably the most innovative new technology: it enables virtual views from any location on the field, giving fans a view close to a player’s. To accomplish this, Intel installed 38 5K cameras high above the field at NRG Stadium in Houston, bolted onto the building’s roof structure. The visual data is fed to servers that digitally reconstruct 3D images of the game. The views will be controlled by a pilot taking requests from Fox broadcasters. Someday, fans will be able to control their own virtual camera, seeing plays and the field from any angle.

 

Audience Expected Watching Methods

 


It’s expected that about 16% of US fans will watch via live-streaming video apps, according to a survey conducted by Survata, with results reported in Variety. Fox Sports Go apps will deliver free live streaming of the game and halftime show. 71% of respondents plan to watch the game at home on Fox with a pay-TV subscription, and 7% said they will watch at a bar. Because NFL games in the US are only available on smartphones to Verizon Wireless customers, just 2% of respondents plan to watch on mobile.


Breaking out the data for millennials reveals they are more likely to stream the game via apps or online, while 57% will watch at home on TV. Super Bowl ads continue to be popular, with 43% of respondents likely to rewatch their favorite commercials online.

 

Delivering the Highest Quality Video

 

Fox will deploy multiple CDNs for delivery and will observe performance across them, looking at rebuffering rates, stream start times, and errors. The traffic split between the CDNs will be managed in real time.
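A hedged sketch of how such real-time CDN selection might weigh the metrics mentioned above. The weights and metric names are invented for illustration; Fox's actual traffic-management logic is not public:

```python
def pick_cdn(metrics):
    """Choose the CDN with the best (lowest) weighted score from recent
    client-side measurements.

    metrics: dict mapping cdn name -> dict with rebuffer_ratio (0-1),
    start_time_s (seconds), and error_rate (0-1).
    """
    def score(m):
        # Illustrative weights: rebuffering and errors hurt the viewing
        # experience more than a slightly slower start.
        return 100 * m["rebuffer_ratio"] + 10 * m["start_time_s"] + 200 * m["error_rate"]

    return min(metrics, key=lambda cdn: score(metrics[cdn]))
```

Re-running this selection on a rolling window of measurements is one way the traffic split could shift between CDNs in real time.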

A Super Bowl streaming first will be custom digital ad insertion for 170 Fox affiliates. Most spots, particularly the pricey national ads, will be the same for broadcast and online; however, online viewers will see some ads unique to digital and, in some markets, ads specific to an affiliate.

 

More to Come

 

After the game, I will provide actual viewing metrics and insights in my next blog. Enjoy the game and see you here next week!

At Limelight, we’re taking the technology lead by innovating at a faster pace and bringing to market what our customers want and need in order to create the world’s best content experiences.  2017 appears to be a banner year for product innovation and customer support. We’re planning a major refresh of our Orchestrate Platform with significant enhancements to our infrastructure, software, and services. Here’s a look at what’s ahead:

 

  • Optimized TCP/IP Implementation: Limelight has teamed up with two of the largest video content and delivery companies in the world to develop a next generation TCP/IP stack. The new environment will enable us to deliver video and live streams with higher quality than ever seen before. Watch for details in the coming months.

 

  • Arc Light: Arc Light harnesses the power of Limelight’s edge servers to facilitate real-time modifications of user requests and origin responses. You’re able to process specialized tasks right at the network edge to improve the user experience, provide a faster time to transaction, and enhance content access security.

 

  • Intelligent Ingest: This capability automatically accelerates the ingest of entire content libraries into Limelight Cloud Storage Services for faster delivery. User-requested content not in the CDN cache can be automatically retrieved and uploaded or you can provide a manifest of content to upload. These new capabilities help simplify your content management processes.

 

  • Security: A new suite of security features for small to mid-sized businesses, offering the ability to protect websites, applications, and their associated origins from malicious requests.
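To make the Arc Light item above concrete, here is a minimal sketch of edge request and response processing in that spirit. The handler API, field names, and token check are hypothetical illustrations, not Limelight's actual interface.

```python
# Illustrative sketch of edge processing in the spirit of Arc Light:
# modify user requests before they reach cache/origin, and modify origin
# responses before they reach the user. The handler shapes are assumptions.

def on_request(request):
    """Runs at the edge before contacting cache or origin."""
    # Content access security: reject requests missing a signed token.
    if "token" not in request.get("query", {}):
        return {"status": 403, "body": "Forbidden"}
    # Normalize device-specific URLs so more requests share a cache key.
    request["path"] = request["path"].replace("/mobile/", "/")
    return None  # None means: continue to cache/origin

def on_response(response):
    """Runs at the edge before returning the origin response to the user."""
    response.setdefault("headers", {})["X-Served-By"] = "edge"
    return response

req = {"path": "/mobile/video/1.mp4", "query": {"token": "abc"}}
assert on_request(req) is None and req["path"] == "/video/1.mp4"
```

Running logic like this at the edge avoids a round trip to origin for rejected requests, which is where the faster time to transaction comes from.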

 

If you’re interested in participating as a pilot customer for these new services or want to learn more, please reach out to your local sales representative. You can also respond to this blog post in the Comments section or email us at limelightengage@llnw.com. Your feedback is appreciated, and it can be the catalyst for additional changes and improvements to provide you with even better services.

Are your web applications protected from cyber threats? Theft of consumer data from popular websites featured prominently in the news in 2016. Yahoo! announced that it had suffered yet another massive attack, with data from more than 1 billion user accounts compromised, making it the largest breach in history. This blog, the second in a series covering protection of your content and applications and access to them, discusses the state of web application threats and defense strategies.

 

Most observers expect the frequency of cyber-attacks to increase during 2017, in part because of the ease with which attacks against web applications can be launched. The application layer is hard to defend because it is exposed to the outside world: for an application to function, it must be accessible over ports 80 (HTTP) and 443 (HTTPS). For a good discussion and demonstration of how the most common web application attacks are performed, read the article and watch the embedded videos from SecurityIntelligence.
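One of the most common attacks demonstrated in such material, SQL injection, exploits queries built by string concatenation. A minimal self-contained sketch using Python's sqlite3 module shows both the vulnerable pattern and the parameterized fix:

```python
# SQL injection in miniature: the same lookup done unsafely (string
# concatenation) and safely (parameterized query). Table and data are
# invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "nobody' OR '1'='1"  # classic injection payload

# Vulnerable: the payload rewrites the WHERE clause and matches every row.
vulnerable = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'").fetchall()

# Safe: the driver treats the payload as a literal value, matching nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)).fetchall()
```

The vulnerable query returns every user despite the bogus name, while the parameterized version returns nothing, which is exactly the class of input-handling flaw WAF rules are written to catch.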

 

Defending Against Web Application Attacks

 

There are two fundamental ways to protect against attacks: on-premises Web Application Firewall (WAF) network nodes and cloud-based protection. On-premises hardware-based WAF nodes, deployed between the internet and an organization’s network, have been a popular solution. These devices contain software that detects the signatures of attacks and passes only legitimate traffic through to the network. But because all traffic to a website must pass through the WAF so it can detect and block attacks, there is a significant impact on web application performance. The reality is that on-premises WAF nodes are almost passé.

 

What is rapidly becoming the go-to solution is cloud-based defense. This is implemented by locating WAF nodes between origin servers and a global Content Delivery Network (CDN), which does the heavy lifting of content caching, web acceleration, and delivery of static content to websites. Web application attacks are dynamic, so dynamic requests are the only traffic the CDN forwards to the WAF nodes. This minimizes the performance impact of WAF protection and locks down IP traffic, since the WAF accepts traffic only from the CDN. The WAF detects attacks by filtering traffic according to rules based on the Open Web Application Security Project (OWASP) list of the ten most critical application security risks. In addition, a security operations center monitors dark-web blogs and industry bulletin boards for new threats. When a new vulnerability is identified, the operations center creates a new security rule and pushes it to all WAF nodes, so even “zero-day” attacks can be closed before application vendors release patches. The scalable cloud-based architecture results in a low total cost of protection for WAF services.
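The traffic flow described above can be sketched as a toy model: the CDN serves cached static content directly, forwards dynamic requests to the WAF, and the WAF rejects anything that did not arrive via the CDN or that matches an attack signature. The IP addresses, signatures, and request shapes are simplified illustrations.

```python
# Simplified model of the cloud-based WAF flow: static content is served
# from CDN cache; only dynamic requests reach the WAF, which accepts
# traffic solely from CDN IPs and filters on signature rules. Addresses
# (RFC 5737 documentation range) and signatures are illustrative.

CDN_IPS = {"203.0.113.10", "203.0.113.11"}
ATTACK_SIGNATURES = ["<script>", "' OR '1'='1", "../"]  # toy OWASP-style rules

def waf(request):
    if request["source_ip"] not in CDN_IPS:
        return {"status": 403, "reason": "non-CDN traffic rejected"}
    payload = request["path"] + request.get("body", "")
    if any(sig in payload for sig in ATTACK_SIGNATURES):
        return {"status": 403, "reason": "attack signature matched"}
    return {"status": 200, "reason": "forwarded to origin"}

def cdn(request, cache):
    if request["path"] in cache:                 # static: serve from cache
        return {"status": 200, "reason": "cache hit"}
    request["source_ip"] = "203.0.113.10"        # dynamic: forward via CDN IP
    return waf(request)
```

Note how the IP lockdown works: an attacker who discovers the WAF's address and sends traffic to it directly is rejected before any rule evaluation, because the request did not originate from the CDN.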

 

Best Practices Right Now

It will take time for the underlying application security vulnerabilities to be patched. In the meantime, there are steps organizations can take to protect themselves.

  •      Implement state-of-the-art web application cyber-attack defenses. At the very least, this means cloud-based protection integrated with a CDN.
  •      Make sure all web application patches are installed. If you have custom web applications, understand how popular cyber-attacks are architected, as described in the article from SecurityIntelligence, and ensure your applications are designed to prevent them.

More to Come

The next blog in this series will cover securing content in motion with HTTPS. This series will also include updates on relevant security events as they occur. See you here next week!

Webinar title:

Key Strategies and Best Practices To Proactively Protect Digital Assets and Apps

 

Description:

In the last few months we have seen cyber-attacks on corporations as diverse as Deutsche Telekom and Tesco. From the boardroom to the backroom there is an increasing awareness of complex cyber-security attacks. Businesses need a proactive solution that not only protects digital content and properties but also stays one step ahead of these evolving threats.


Join us on 18th January for a live webinar with Limelight's digital security experts, who will share specific use cases to illustrate key strategies and best practices for a cloud-based security solution. 

 

Presenters:

Charlie Kraus, Sr. Product Marketing Manager, Limelight Networks & Kerrion Burton-Evans, Solutions Engineer, Limelight Networks

 

Duration:

60 min

 

Date & Time:

Jan 18, 2017, 2:00 pm

 

Timezone:

United Kingdom - London