The Edge: The Computing System That Will Replace the Cloud
In the previous decade, companies in every industry put a huge emphasis on Cloud-based data collection and computing. The advantage was that their systems were centralised and accessible, allowing increased levels of security and collaboration. However, at the start of the 2020s, we are beginning to witness a shift from Cloud computing to Edge computing.
What is the Edge?
Whilst Cloud computing happens at a central location on the internet, Edge computing takes place at the ‘outer edge’ of the internet, close to the data source. For example, data analysis takes place on board or adjacent to a connected camera, as opposed to the camera sending its data to a centralised location to be analysed.
Self-driving cars are a perfect example of Edge computing. When a person drives down a road, they observe the road and their surroundings in real time and stop instantly if somebody were to walk out in front of the car. For self-driving cars to be safe, they must respond in real time just as a human driver would. To do this, they process visual information from on-board cameras to decide when to stop. This decision making is done at the Edge: sending the information to a centralised source would introduce a delay, meaning the decision to stop would be made too late and the car would be unsafe.
Centralised Cloud systems make collaboration and access easy, but they are remote from the data sources. Data must be transmitted from the source to the Cloud and the results sent back, and network latency introduces delays. For self-driving cars, the time from data collection to decision making must be as short as possible, so with Edge computing the data is analysed on board and the necessary action is taken in real time.
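To make the latency argument concrete, here is a minimal sketch in Python. All of the timing figures are invented assumptions for illustration, not measurements from any real vehicle or network.

```python
# Hypothetical latency comparison between an on-board (Edge) decision
# and a round trip to the Cloud. Every figure here is an assumption
# chosen for illustration only.

CAMERA_CAPTURE_MS = 5        # time to capture one frame
EDGE_INFERENCE_MS = 20       # on-board model inference
CLOUD_INFERENCE_MS = 10      # faster hardware in the data centre
NETWORK_ROUND_TRIP_MS = 120  # upload frame + download decision

def edge_decision_latency_ms() -> int:
    """Frame is processed on board; no network hop is involved."""
    return CAMERA_CAPTURE_MS + EDGE_INFERENCE_MS

def cloud_decision_latency_ms() -> int:
    """Frame travels to the Cloud and the decision travels back."""
    return CAMERA_CAPTURE_MS + NETWORK_ROUND_TRIP_MS + CLOUD_INFERENCE_MS

print(f"Edge decision:  {edge_decision_latency_ms()} ms")
print(f"Cloud decision: {cloud_decision_latency_ms()} ms")
```

Even with faster inference hardware in the data centre, the network round trip dominates the Cloud path, which is exactly why the stopping decision has to be made at the Edge.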
Why move from the Cloud to the Edge?
As outlined above, with Cloud computing, devices at the Edge play a limited role: they send raw data and receive processed results, while all the real work is done in the Cloud.
Computing at the Edge reduces data transmission, as most of the work happens on the device itself. Instead of raw data being sent to the Cloud and processed data being sent back, the Edge device processes the raw data and sends only the end result to the Cloud. Less transmission bandwidth is therefore required, and the cost of transmitting data falls.
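A rough sketch of that bandwidth saving, assuming a hypothetical camera that produces roughly 2 MB raw frames at 30 frames per second but whose on-board detector emits only a small summary per frame (all sizes are illustrative assumptions, not real camera specifications):

```python
# Illustrative uplink bandwidth comparison. The frame size, result
# size and frame rate are assumptions made up for this example.

RAW_FRAME_BYTES = 2_000_000  # ~2 MB per uncompressed frame
RESULT_BYTES = 200           # small per-frame summary (e.g. JSON)
FRAMES_PER_SECOND = 30

def cloud_uplink_bytes_per_s() -> int:
    """Cloud model: every raw frame is sent upstream for processing."""
    return RAW_FRAME_BYTES * FRAMES_PER_SECOND

def edge_uplink_bytes_per_s() -> int:
    """Edge model: only the processed result is sent upstream."""
    return RESULT_BYTES * FRAMES_PER_SECOND

saving = 1 - edge_uplink_bytes_per_s() / cloud_uplink_bytes_per_s()
print(f"Uplink bandwidth reduced by {saving:.2%}")
```

Under these assumptions the Edge device sends four orders of magnitude less data upstream, which is where the transmission cost saving comes from.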
With a Cloud-based computing system, adding more devices increases the bandwidth required, as more data is sent back and forth to the centralised Cloud for processing. Network traffic grows and the uplink bandwidth becomes a bottleneck. This makes the process costly, because network traffic, bandwidth and Cloud resources must all increase with every additional device.
With an Edge-based computing system, by contrast, bandwidth and Cloud resources do not need to grow with each device, as most of the processing is done at the Edge. Scaling the system is therefore cheaper, since there is no need to increase Cloud computing power or network bandwidth in step with every new device. Edge computing systems are consequently more scalable than Cloud computing systems.
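The scaling argument can be sketched as simple arithmetic. The per-device bandwidth figures below are illustrative assumptions: in the Cloud model each camera streams raw video upstream, while in the Edge model each device sends only tiny result messages.

```python
# Back-of-the-envelope scaling sketch. Per-device figures are
# hypothetical: ~16 Mbps for a raw video stream versus ~0.01 Mbps
# for compact result messages.

def cloud_uplink_mbps(devices: int, per_device_mbps: float = 16.0) -> float:
    """Cloud model: uplink demand grows steeply with every device."""
    return devices * per_device_mbps

def edge_uplink_mbps(devices: int, per_device_mbps: float = 0.01) -> float:
    """Edge model: each device adds only a trivial uplink load."""
    return devices * per_device_mbps

for n in (10, 100, 1000):
    print(f"{n:>4} devices: Cloud {cloud_uplink_mbps(n):>8.1f} Mbps, "
          f"Edge {edge_uplink_mbps(n):>6.2f} Mbps")
```

Both totals grow with the number of devices, but under these assumptions a thousand Edge devices need less uplink than a single Cloud-fed camera, which is why the Edge system scales without upgrading the network.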
With demand for Edge-based computing systems growing, it is likely that by 2024-2025 many businesses will transition away from Cloud-based systems to the Edge. Demand for real-time applications, combined with the ease of scaling, means the Edge could potentially help shape the look of data storage and analysis over the next decade.