20 Years of Edge Computing
How long will you wait for something? That depends on what you’re waiting for, of course. But in your daily interactions, think about how many “things” you interact with where you expect the response to be instantaneous – tapping on mobile apps, logging in and transacting with a retailer or a bank, selecting and viewing content on a streaming media device, interacting with a connected device like a light, doorbell, or even a car, or checking in for a train or flight. All of these services require real-time processing of information at massive scale, and all of them are places where edge computing, by bringing processing closer to devices and data, can make the difference between a great experience and a really frustrating one.
Edge computing is not a new idea, just one that for decades was ahead of the market’s ability to fully appreciate it. High-speed trading and the optimization and localization of services at branch offices are two long-standing examples of pushing business logic closer to where the action is. Modern technology adds use cases such as enabling faster decisions for connected cars and other IoT devices, or improving network processing speeds with 5G.
At the most basic level, edge computing brings data, insights, and decision-making closer to the things that act upon them. Rather than relying on a central location that can be thousands of miles away, the edge is as near to the “thing” as possible. The goal is ultimately a reliable, scalable implementation so that data, especially real-time data, does not suffer latency issues that can affect an application’s purpose or performance. At Akamai, we have been operating an edge platform that delivers edge computing capabilities for over twenty years. The Akamai Intelligent Edge is broadly recognized as the largest distributed network platform and has been delivering edge computing solutions to our customers virtually from day one.
In the year 2000, centralized computing occurred in data centers, and Akamai implemented our first iteration of edge computing to battle the “world wide wait.” Our platform provided situational performance by assembling and delivering content according to each customer’s business logic, using a computing paradigm we called advanced metadata. Advanced metadata is an implementation of edge computing that uses XML to describe business logic for a given customer.
In 2001, we worked with other industry leaders to develop a standard edge computing implementation for developers called Edge Side Includes (ESI). ESI helped our customers scale, increase performance, and save money by moving fine-grained business logic that would otherwise have run in their own data centers out to our edge, reducing the amount of data their infrastructure needed to process. Fast forward 20 years to today, and this capability is commonly called serverless computing, since it lets customers scale business logic independently of scaling infrastructure.
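To make that concrete, here is a minimal ESI sketch (the fragment paths and cookie name are illustrative, not taken from a real customer configuration). The edge server caches the page template, evaluates the condition per request, and fetches only the fragments it needs, so the origin never has to assemble the full page:

```html
<html>
  <body>
    <!-- Cached header fragment, included on every request; ignore failures -->
    <esi:include src="/fragments/header.html" onerror="continue"/>

    <!-- Per-request business logic evaluated at the edge instead of the origin:
         pick a promotions fragment based on a (hypothetical) loyalty cookie -->
    <esi:choose>
      <esi:when test="$(HTTP_COOKIE{loyalty})=='gold'">
        <esi:include src="/fragments/offers-gold.html"/>
      </esi:when>
      <esi:otherwise>
        <esi:include src="/fragments/offers-default.html"/>
      </esi:otherwise>
    </esi:choose>
  </body>
</html>
```

Because the conditional assembly happens at the edge, the origin serves small, highly cacheable fragments rather than rendering a distinct page for every user.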
In 2002, we pioneered the use of Java and .NET technology at the edge and started to use the term “edge computing” to describe our approach. As we worked with our customers to hone the value proposition for, and the utility provided by, edge computing, we realized that customers were not looking for raw performance alone. Their ultimate goal was programmatic access with which to build reliable, scalable, and secure implementations of their business logic, so that data, especially real-time data, did not suffer latency issues that could affect an application’s purpose or performance.
By leveraging Akamai’s Intelligent Edge as a programmable edge, they could maximize quality and reliability for their consumers, evaluate and optimize personalized web and mobile experiences, and strengthen data autonomy, infrastructure security, and application security at global scale.
Today we have a myriad of edge computing solutions in use by customers across every major vertical. In fact, edge computing innovations represent approximately 20% of Akamai’s more than 400 technology patents, and edge computing solutions generated well over $2 billion in revenue over the twelve months ended June 30, 2020. Those solutions can be grouped into three categories: content & media delivery, app & IoT optimization, and cloud & enterprise security.
Today, we are witnessing a massive shift to a maker culture, and nearly all of our customers have infrastructure deployed in the cloud. Makers prefer to have fine-grained control over their compute deployments, which is why Akamai has been investing in open APIs and an API-first approach to development since 2015. In 2017, we introduced our strategy to support the growing IoT edge computing ecosystem, which included extending our protocol support to include MQTT and implementing a publisher/subscriber model with an initial use case of connected vehicles. In 2019, we officially launched Akamai IoT Edge Cloud to manage bi-directional messaging updates and deliver insights to endpoints and infrastructure, announced our partnership with MUFG to develop technology to power blockchain ledger updates at the edge, and opened up access to Akamai EdgeWorkers to provide JavaScript-based business logic at the edge.
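To give a flavor of what that JavaScript business logic can look like, here is a minimal sketch in the style of an EdgeWorkers request handler; the path, header name, and routing rule are hypothetical, chosen purely for illustration rather than drawn from a real deployment:

```javascript
// Sketch of per-request business logic running at the edge, in the style of an
// EdgeWorkers event handler. Nothing here is a real customer rule; it simply
// shows that a request can be inspected, answered, or annotated at the edge
// without ever reaching the origin.
export function onClientRequest(request) {
  // Answer a hypothetical lightweight path directly from the edge.
  if (request.path === '/edge-ping') {
    request.respondWith(
      200,
      { 'Content-Type': ['application/json'] },
      JSON.stringify({ status: 'ok', servedFrom: 'edge' })
    );
    return;
  }

  // Tag the forwarded request with the viewer's country (when available) so
  // downstream logic can localize or personalize the response.
  const country = request.userLocation ? request.userLocation.country : 'unknown';
  request.setHeader('X-Edge-Country', country);
}
```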
These edge computing solutions are complemented by tooling to integrate Akamai-as-code into any DevOps toolkit and CI/CD process, including the ability to deploy and test code in sandbox environments and to support automated software quality improvement with global load testing and user session replay on our edge platform. There are hundreds of thousands of lines of code deployed on the Akamai Intelligent Edge to manage customer applications through these various implementations today, and that number continues to grow as our customers transform their digital businesses.
Today Akamai interacts not just with content, but with objects, to help businesses realize unparalleled performance and security at scale. The Intelligent Edge has evolved to receive, store, and act upon objects in a way that is 100% protocol independent: it works with HTTP, video streaming, and IoT applications. Our goal is to deliver the world’s largest and most distributed programmable messaging fabric, making Akamai part of the application itself, an evolution from our origins as part of application delivery. We are excited to share this journey with you and look forward to seeing you at the edge!