
Powering Low Latency Live Streaming

Live streaming has grown immensely over the years. Today, we can watch live events from all around the world on our laptops, mobile phones, set-top boxes, smart TVs, and other connected devices.

A big challenge with live streaming is achieving the same level of availability as over-the-air broadcast and cable television. Live streaming over the internet is particularly vulnerable to network loss, latency spikes, and server failures. Typically, these challenges are overcome with large buffers on the client and server side; live stream content is commonly buffered anywhere between 10 and 50 seconds behind live. If there are temporary network or server issues, this buffer gives the client time to recover before playback stalls and rebuffers. The tradeoff with larger buffers, however, is that playback runs at least 10 to 50 seconds behind live. In today’s hyper-connected world, if a live stream is delayed by 50 seconds, you’ll hear about that touchdown or goal on your social media feed well before you see it, which can ruin an otherwise immersive live streaming experience.

Akamai is pushing the bounds of live streaming to achieve lower latencies while still retaining high quality playback. Consumers want to see a reliable live stream in real time, and Akamai has built a number of key capabilities to reach end-to-end latencies as low as 1 second.

Akamai’s live streaming architecture for low latency has 5 distinct server tiers:

[Figure: Akamai’s five-tier low latency live streaming architecture]

  1. Ingest – Ingests the live stream into the Akamai network and dual-POSTs to Origin

  2. Origin – Stores and replicates content into 3 diverse locations.  The 3 Origin servers are dynamically selected and are called Lead, Hot Backup #1, Hot Backup #2.

  3. Mid-Tier – Discovers and fetches content from the Origins with fault-tolerance

  4. Origin Shield – Protects the Mid-Tier and Origin servers

  5. Edge – Delivers content to Players with global scale

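The tiers above can be read as an ordered pipeline. As a purely illustrative sketch (inferring the request path from the tier descriptions, not from any Akamai internals), a player’s cache-miss request travels edge-inward:

```python
# Illustrative only: the five tiers as an ordered pipeline. A player's
# cache-miss request is assumed to traverse the serving tiers in reverse,
# from the Edge back toward the Origin.

TIERS = ["Ingest", "Origin", "Mid-Tier", "Origin Shield", "Edge"]

def request_path():
    """Tiers a player's cache-miss request traverses, Edge back to Origin."""
    return list(reversed(TIERS[1:]))  # Ingest only handles the encoder side
```
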
Each tier is optimized with unique features that enable high throughput, fault-tolerance, and low latency. Here are the 6 most noteworthy features:

  • Optimal Ingest Selection (Encoder to Ingest) – Encoders are dynamically assigned to the best choice Ingest server based on load and network conditions.  This ensures the stream can enter the Akamai network with high reliability and minimal network path latency between the Encoder and Ingest server.

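A minimal sketch of what dynamic ingest selection could look like, weighing load against network round-trip time. The server names, fields, and weights below are hypothetical, not Akamai’s actual mapping logic:

```python
# Hypothetical ingest selection: score each candidate server by current
# load and measured RTT, and pick the lowest score. Weights are illustrative.

def select_ingest(candidates, load_weight=0.6, rtt_weight=0.4):
    """Return the candidate with the best (lowest) weighted score."""
    max_rtt = max(c["rtt_ms"] for c in candidates)

    def score(c):
        # Normalize RTT against the slowest candidate so both terms are 0..1.
        return load_weight * c["load"] + rtt_weight * (c["rtt_ms"] / max_rtt)

    return min(candidates, key=score)

servers = [
    {"host": "ingest-a.example.net", "load": 0.80, "rtt_ms": 12},
    {"host": "ingest-b.example.net", "load": 0.30, "rtt_ms": 28},
    {"host": "ingest-c.example.net", "load": 0.90, "rtt_ms": 5},
]
best = select_ingest(servers)  # the lightly loaded server wins despite higher RTT
```
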
  • Multi-Path Replication (Ingest to Origin) – Ingest forwards multiple copies of the content along diverse network paths to the Origins. This mitigates impact to the live stream if one network path has transient breakage because a second path is still available and functioning.  There is no headroom for retries and delays with low latency live streaming. Always-on redundancy is the only guaranteed way to achieve reliable low latency streaming.

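The always-on redundancy idea can be sketched as follows, with `post_fn` standing in for a real HTTP POST (a simulation, not Akamai’s implementation): the same segment is sent along several paths in parallel, and the upload succeeds as long as at least one path delivers.

```python
# Hypothetical multi-path replication: POST the same segment over every
# path concurrently; a transient failure on one path does not block the rest.

from concurrent.futures import ThreadPoolExecutor

def replicate(segment, paths, post_fn):
    """POST `segment` over every path at once; return the paths that succeeded."""
    with ThreadPoolExecutor(max_workers=len(paths)) as pool:
        results = list(pool.map(lambda path: (path, post_fn(path, segment)), paths))
    return [path for path, ok in results if ok]

def flaky_post(path, segment):
    # Simulate transient breakage on one of the diverse network paths.
    return path != "path-b"

delivered = replicate(b"segment-0001.m4s", ["path-a", "path-b", "path-c"], flaky_post)
```
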
  • Resilient Origin (Origin to Origin) – Content is stored on 3 Origin servers.  These Origin servers are dynamically selected based on server health, network conditions, and data center diversity.  The Origin servers replicate content among one another, with one Lead Origin and two Hot Backup Origins, thus providing 3-way storage redundancy for the content.

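One plausible way to express that selection, keeping only the healthiest server per data center for diversity and then ranking by health (the health score and fields are invented for illustration):

```python
# Hypothetical resilient Origin selection: enforce data center diversity,
# then rank by health; best becomes the Lead, the next two the Hot Backups.

def select_origins(servers):
    """Return (lead, hot_backup_1, hot_backup_2) from diverse data centers."""
    best_per_dc = {}
    for s in sorted(servers, key=lambda s: s["health"], reverse=True):
        best_per_dc.setdefault(s["dc"], s)  # healthiest server in each data center
    ranked = sorted(best_per_dc.values(), key=lambda s: s["health"], reverse=True)
    if len(ranked) < 3:
        raise RuntimeError("need healthy origins in at least 3 data centers")
    return tuple(ranked[:3])

origins = [
    {"host": "o1", "dc": "east", "health": 0.99},
    {"host": "o2", "dc": "east", "health": 0.97},  # skipped: same DC as o1
    {"host": "o3", "dc": "west", "health": 0.95},
    {"host": "o4", "dc": "central", "health": 0.90},
]
lead, backup1, backup2 = select_origins(origins)
```
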
  • Near Zero Hand-Waving Latency (Mid-Tier to Origin) – Origin servers make an object available for download as soon as the first byte of the segment is received from the Encoder.  The Origin server doesn’t wait for the entire object to arrive before making it available for download. As more data comes in from the Encoder, it is pipelined directly to the Edge via HTTP chunked transfer encoding.

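The mechanism that makes this possible is standard HTTP/1.1 chunked transfer encoding (RFC 7230): each piece of data is framed with its own length, so the server can flush bytes the moment they arrive. A minimal sketch of the wire framing:

```python
# HTTP/1.1 chunked transfer encoding framing: each chunk is
# <hex length>\r\n<data>\r\n, and a zero-length chunk ends the body.
# The payload bytes below are illustrative placeholders.

def encode_chunk(data: bytes) -> bytes:
    return b"%x\r\n%s\r\n" % (len(data), data)

def end_of_body() -> bytes:
    return b"0\r\n\r\n"

# As bytes arrive from the Encoder, each piece can be flushed immediately,
# long before the full segment exists.
wire = encode_chunk(b"first bytes") + encode_chunk(b"more bytes") + end_of_body()
```
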
  • Optimized for High Throughput (Mid-Tier to Origin) – The Mid-Tier downloads from the best choice Lead and Hot Backup Origins.  Being able to download from multiple Origins over different paths ensures resiliency against temporary network and server issues.  This, in turn, ensures that high throughput and low latency are maintained.

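A simple failover loop captures the idea, with `fetch_fn` standing in for a real HTTP GET (a sketch under that assumption, not Akamai’s code): try the Lead Origin first, then each Hot Backup in turn, so a transient failure on one Origin does not stall the stream.

```python
# Hypothetical Mid-Tier download with failover across the Lead and Hot
# Backup Origins; the first Origin that serves the segment wins.

def fetch_with_failover(segment, origins, fetch_fn):
    """Return (origin, data) from the first Origin that serves the segment."""
    last_err = None
    for origin in origins:  # ordered: [lead, hot_backup_1, hot_backup_2]
        try:
            return origin, fetch_fn(origin, segment)
        except IOError as err:
            last_err = err  # transient failure; fall through to the next Origin
    raise RuntimeError(f"all origins failed for {segment}") from last_err

def flaky_fetch(origin, segment):
    if origin == "lead-origin":
        raise IOError("connection reset")  # simulate a transient Lead failure
    return b"segment bytes"

served_by, data = fetch_with_failover(
    "seg1.m4s", ["lead-origin", "backup-1", "backup-2"], flaky_fetch
)
```
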
  • HTTP Edge Server Footprint (Player to Edge) – Since content is served via HTTP, this architecture utilizes the Edge tier’s highly scalable, massive global footprint of HTTP caching servers.

These features are the pillars of Akamai’s highly resilient low latency live streaming architecture.  This architecture powers cutting-edge workflows, including CMAF Ultra Low Latency Streaming, where hand-waving latencies can be as low as 1 second.

It should also be noted that high performance and reliability often require higher server utilization from more operations per second (ops), higher bandwidth costs for replication, and higher storage costs for redundancy.  However, Akamai has focused its efforts on providing cost effective solutions by tuning server performance, optimizing traffic routing, and virtualizing server instances.

With the ubiquity of live streaming in today’s era, consumer expectations for good quality and high performance will only grow.  Akamai will continue to push the envelope to improve performance and resiliency with more innovative features and continue to raise the bar for the industry.  Stay tuned for more blog posts where we’ll compare the performance improvements of this low latency live streaming architecture versus traditional non-optimized architectures. In the meantime, to learn more about Akamai’s live streaming solution click here.

*** This is a Security Bloggers Network syndicated blog from The Akamai Blog authored by Chandan Rao. Read the original post at: http://feedproxy.google.com/~r/TheAkamaiBlog/~3/kbHvxq-3LBI/powering-low-latency-live-streaming.html
