Cloud computing has dramatically changed the way we live, work, and study since its inception, and it continues to grow in importance in Information Technology. Businesses and organizations have already moved their computing tasks to the cloud, which has proved an efficient way to store and process data. However, cloud computing is not efficient enough to handle the growing quantity of data generated by the Internet of Things (IoT). So what can be done to overcome the limitations of the current cloud-centric architecture?
The answer is edge computing. Computing has already migrated from on-premises servers to cloud servers, and is now progressively moving from the cloud to edge servers located where the data originates. Edge computing pushes the computational infrastructure closer to the data source to ensure swift access. Below, we compare the two computational models to understand how some of the issues in the cloud computing model may be resolved by the edge computing paradigm.
What is Cloud Computing?
Cloud computing is the on-demand delivery of computing resources, including servers, storage, databases, and software, over the Internet rather than from a local server or a personal computer. The cloud is a distributed technology platform that provides highly scalable environments that businesses and organizations can use remotely in a multitude of ways. Businesses have moved to the cloud for better scalability, mobility, and security. The cloud provides comprehensive storage, processing, and management mechanisms. Businesses avoid the upfront cost and complexity of maintaining their own IT infrastructure, and simply pay for what they use, when they use it. Some of the major cloud service providers include Google, Oracle, Microsoft, IBM, Cisco, Verizon, and Rackspace. An example of a cloud computing platform is Microsoft Azure.
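The pay-as-you-go model described above can be sketched in a few lines: the consumer is billed only for the resources actually consumed, with no upfront infrastructure cost. The resource names and hourly rates below are hypothetical illustrations, not any provider's real pricing.

```python
# Hypothetical metered prices per resource-hour (illustrative only).
HOURLY_RATES = {
    "vm_small": 0.05,     # one small virtual machine, per hour
    "storage_gb": 0.002,  # one gigabyte of storage, per hour
}

def metered_bill(usage_hours: dict) -> float:
    """Return the total charge for metered usage (resource -> hours used)."""
    return sum(HOURLY_RATES[resource] * hours
               for resource, hours in usage_hours.items())

# A workload that runs a small VM for 100 hours and keeps
# 50 GB stored for those same 100 hours (5000 GB-hours):
bill = metered_bill({"vm_small": 100, "storage_gb": 50 * 100})
print(f"${bill:.2f}")  # → $15.00
```

When the workload stops, billing stops with it; this is the contrast with owning idle on-premises hardware.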
What is Edge Computing?
Edge computing is a distributed computing paradigm that brings computation closer to the network edge, as opposed to the conventional cloud computing structure. The idea is to extend cloud computing in a more geo-distributed manner, in which computational, networking, and storage resources are spread across locations much closer to the end-user applications that need them. The objective of edge computing is that computation should happen in the proximity of the data sources. In edge computing, things act not only as data consumers but also as data producers. Edge computing has played a major role in overcoming the complexities of cloud computing: nearby devices are used as servers to provide better services. Moving computation closer to the edge devices overcomes limitations of traditional cloud computing such as bandwidth, latency, and high cost.
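The pattern described above can be sketched as a minimal example: an edge node sits next to the data source (here, a temperature sensor), handles routine readings locally, and forwards only the events that need central attention to the cloud. The function name and threshold are illustrative assumptions, not part of any standard API.

```python
def edge_filter(readings, threshold=75.0):
    """Process sensor readings at the edge; forward only out-of-range values.

    Returns the readings to send upstream and the count handled locally.
    """
    forwarded = [r for r in readings if r > threshold]
    handled_locally = len(readings) - len(forwarded)
    return forwarded, handled_locally

readings = [68.2, 70.1, 91.4, 69.8, 88.0, 71.3]
to_cloud, local = edge_filter(readings)
print(to_cloud)  # only the anomalous readings travel upstream: [91.4, 88.0]
print(local)     # the other 4 are consumed where they were produced
```

Here the "thing" is both a data producer (it generates readings) and a data consumer (it acts on them in place), which is the dual role the paragraph above describes.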
Difference between Cloud Computing and Edge Computing
Definition
– Cloud computing is the on-demand delivery of computing resources, including servers, storage, databases, and software, over the Internet rather than from a local server or a personal computer. The word ‘cloud’ is a metaphor for the Internet, and ‘cloud computing’ is a type of Internet-based computing that means storing and accessing data and programs over the Internet. Edge computing, on the other hand, refers to the deployment of data handling and other network operations away from the cloud servers and towards the edge of the network, where the data originates, in order to overcome the limitations of conventional cloud computing.
Architecture
– Cloud computing architecture refers to the many loosely coupled components and sub-components required for cloud computing, and defines those components and the relationships between them. Cloud computing delivers IT infrastructure and applications as a service, on a pay-as-you-go basis, to individuals and organizations via Internet platforms. Edge computing extends conventional cloud computing with a distributed computing paradigm that brings applications and data closer to the network edge, towards the sources of data capture, in order to address issues such as response time, data security, and power consumption.
Benefits
– In edge computing, things act not only as data consumers but also as data producers. Edge computing facilitates the operation of compute, storage, and networking services between the end devices and cloud computing data centers. The cloud requires a lot of bandwidth, and wireless networks have their limits; with edge computing, the bandwidth required is significantly reduced. Because nearby devices are used as servers, issues such as power consumption, security, and response time are addressed efficiently and effectively. Edge computing is used to increase the overall performance of the IoT.
Cloud Computing vs. Edge Computing: Comparison Chart
– Processing location: Cloud computing processes data in centralized remote data centers; edge computing processes data near the source where it is gathered.
– Bandwidth and latency: Cloud computing requires a lot of bandwidth and suffers from latency; edge computing significantly reduces the bandwidth required and improves response time.
– Distribution: Cloud computing is delivered from a relatively small number of large data centers; edge computing has a much wider geographical distribution and greater proximity to end-users.
– Role of devices: In cloud computing, things mainly act as data consumers; in edge computing, things act as both data consumers and data producers.
Summary
The cloud provides comprehensive storage, processing, and management mechanisms. However, cloud computing is not efficient enough to handle the growing quantity of data generated by the Internet of Things (IoT). This is where edge computing comes into the picture. Edge computing pushes the computational infrastructure closer to the data source where it is needed, in order to address issues such as response time, data security, and power consumption. Just like cloud computing, edge computing provides compute, storage, and applications to be consumed by end-users. However, edge computing has a much wider geographical distribution and much greater proximity to the end-users.