Why is edge computing needed?
Edge computing is not necessarily a revolution in cloud computing but an evolution of what came before: so why do we need the edge when we have the cloud?
It’s perhaps no surprise that the cloud’s history is hazy. However, the Mix Network, devised in 1981 by David Chaum, is arguably where cloud computing originates.
The Mix Network conceptualised an email network in which the authentication of messages was decentralised. It was from this philosophy that Chaum proposed the first decentralised payment system the following year. Chaum’s network also used a precursor to Onion Routing, the technique most famously associated with the dark web.
It took a while longer before cloud computing as we know it became ubiquitous. The exact moment is up for debate, but in 2019 the cloud is all around us, literally and figuratively. So where does edge computing come in, exactly?
The past couple of decades have seen a significant shift to cloud computing, so with edge computing rising in popularity, it makes sense to question its place in the technology world. The gargantuan rise of IoT is calling into question whether the cloud is still the most efficient way to store and process the data that all of these devices generate.
Where did edge computing come from?
Edge computing is not a revolution following the cloud. It’s an evolution of the technology.
Let’s go back to basics: data needs to be stored somewhere. On-premises computing meant that data had to be stored locally, on a company’s own machines; cloud computing liberated it, allowing data to be stored on remote servers far away.
Edge computing is the natural progression of this: it brings data closer to where it’s needed. This could be through sensors that collect data, like RFID tags, or on-site data centres that companies invest in. Edge computing can still connect to the cloud if necessary.
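As a rough illustration of that pattern, raw readings can be handled on the local network, with only a small summary sent on to the cloud when needed. This is a minimal sketch; the function names and readings below are invented for the example, not taken from any particular framework:

```python
from statistics import mean

def read_local_sensors() -> list[float]:
    """Stand-in for raw readings gathered at the edge, e.g. from RFID-tagged stock."""
    return [21.3, 21.5, 21.4, 21.6]  # invented readings for the example

def upload_summary(summary: dict) -> None:
    """Stand-in for the optional hop up to a central cloud service."""
    print(f"Sending to cloud: {summary}")

readings = read_local_sensors()  # raw data stays on the local network...
# ...and only a small aggregate ever leaves the site
upload_summary({"count": len(readings), "average": round(mean(readings), 2)})
```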
However, whether cloud computing truly is decentralised is still a sticking point. In theory, cloud computing is very much decentralised, but in practice this isn’t the case: a handful of big cloud providers have cornered the market, meaning that data is stored and processed in a small number of enormous data centres.
It’s understandable; the institutions of computers were replaced with the cloud, which in turn became institutions themselves. This is where the edge comes in. Where cloud computing took power away from centralised, on-premises servers, edge computing is taking power away from the cloud.
It’s a necessary next step for us as tech users. The cloud satisfied our needs as computer users and the edge is doing the same thing for IoT devices.
What are the main advantages of the edge?
Just as the cloud provided a raft of opportunities when it came to prominence in the mid-2000s, edge computing has significant advantages, too.
The edge is considerably quicker than the cloud. That’s perhaps the biggest benefit of having a data centre located so close, and that lower latency is already being put to use in a host of ways. Autonomous vehicles can receive information far quicker from the edge than they ever could from the cloud, and smart cities could process maintenance data as and when it occurs. Smart traffic lights, for example, could receive signals from nearby rather than from a distant data centre.
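To put rough numbers on that speed gap, here’s a back-of-the-envelope sketch. The distances are illustrative assumptions, and real-world latency would add processing and queuing time on top of pure propagation delay:

```python
# Light travels at roughly 200 km per millisecond in optical fibre.
FIBRE_KM_PER_MS = 200

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fibre, ignoring processing."""
    return 2 * distance_km / FIBRE_KM_PER_MS

print(f"Distant cloud region (1,500 km away): {round_trip_ms(1500):.2f} ms")  # 15.00 ms
print(f"Nearby edge node (5 km away):         {round_trip_ms(5):.2f} ms")     # 0.05 ms
```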
If devices can interact with the edge far more easily than with the cloud, that also raises questions about how we store data. Data stored at the edge does not necessarily need to travel to the cloud at all, which could, in time, change the way we think about the privacy and security issues that dominate discourse around the cloud.
One of the other big benefits of the edge is outage reduction. By relying on the cloud, your business is exposed to server downtime, an issue that becomes more serious the more congested a cloud server becomes. Some mission-critical operations, such as chemical plants, have never used the cloud for this very reason.
The edge instead relies on a connection between individual sensors and a local data centre, which reduces the opportunity for outages. And if different devices connect to different edge servers, there’s no need for an entire business to go down just because one server has.
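A minimal sketch of that idea might have each device try its nearby edge server first and only fall back to the cloud if the edge is unreachable. The endpoints below are hypothetical, purely for illustration:

```python
import requests  # third-party HTTP library: pip install requests

# Hypothetical endpoints, not real services.
EDGE_URL = "http://edge-gateway.local/telemetry"
CLOUD_URL = "https://cloud.example.com/telemetry"

def send_reading(reading: dict) -> str:
    """Try the local edge server first; fall back to the cloud if it's down."""
    for tier, url in (("edge", EDGE_URL), ("cloud", CLOUD_URL)):
        try:
            response = requests.post(url, json=reading, timeout=2)
            response.raise_for_status()
            return tier  # report which tier accepted the reading
        except requests.RequestException:
            continue  # this tier is unreachable; try the next one
    raise ConnectionError("Neither edge nor cloud is reachable")

# e.g. send_reading({"sensor": "traffic-light-42", "state": "green"})
```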
How long before edge computing takes over?
As IoT’s relevance grows, so does the edge’s. Even individual devices in your home, such as an Amazon Echo, can be classified as edge devices.
The edge is growing steadily. It took years for the cloud to become as universal as it is, so whilst there’s no specific date on which the edge will become more important, edge computing looks set to surpass the cloud before long.