
Fog resource estimation and its challenges

The Fog computing layer brings computing, networking, and storage services closer to the end-nodes in IoT. Compared to Cloud computing, this layer is highly distributed and introduces additional services to the end-devices located in the perception layer. This bridging layer goes by different names with similar or only slightly varying purposes; edge computing, micro-cloud, and cloudlet are some of the related terms. Regardless of the name, the idea of introducing an intermediate computing layer in IoT is motivated by a similar set of challenges. Moreover, the set of services that can potentially be integrated into the Fog computing layer is vast. Some of these services are scaled versions of the ones provided by Cloud computing, while most of them emerged recently in response to IoT challenges (Negash, 2018).

The Internet of Things (IoT) is a concept in which an object has the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. IoT has evolved from the convergence of wireless technologies, micro-electromechanical systems (MEMS), and the Internet. A “thing” in the Internet of Things can be a person with an implanted heart monitor, a farm animal with a biochip transponder, or a car with built-in sensors that alert the driver when tire pressure is low. IoT is most closely related to machine-to-machine (M2M) communication in manufacturing and in the electricity, oil, and gas industries. Products built with M2M communication capabilities are often referred to as intelligent or “smart” systems, for example smart cables, smart meters, and smart grid sensors. Research on IoT applications is still at the development stage, so there is no single agreed-upon definition of the Internet of Things.

Fog computing was first introduced by Cisco. Compared to cloud data centers, fog provides a virtual computing environment that runs between the cloud and end users. Cloud and fog have the same function of serving users; the difference is that fog serves specific, localized areas. Fog computing was developed for delay-sensitive IoT applications: in the cloud, the farther the distance between the cloud server and the end user, the higher the latency. Fog computing supports emerging Internet of Things (IoT) applications that demand real-time, predictable latency, such as industrial automation, transportation, and sensor and actuator networks. Thanks to its wide geographic distribution, the Fog paradigm is well positioned for large-scale real-time data collection and real-time analytics. Fog supports densely distributed data collection points, thus adding a fourth axis to the frequently mentioned dimensions of Big Data (volume, variety, and velocity).


Figure: Overview of fog computing architecture


As shown in the block diagram above, fog nodes are located between cloud nodes and end devices; fog nodes are therefore also called smart gateways. At these fog nodes, data sensed by end devices is not simply passed on to the cloud nodes: data processing is also carried out locally whenever the computational load of the fog node is not high. Computation is thus performed in a distributed manner across the fog nodes and the cloud nodes. Fog can run on edge network devices such as routers, access points, Road Side Units (RSUs), and similar hardware. By adding fog nodes, reliability, fault tolerance, and scalability can be adjusted easily. Fog computing is therefore well suited to implementing a Smart Gateway.
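As a rough illustration of this behavior, the Python sketch below shows how a smart gateway might decide whether to process a sensor reading locally or forward it to the cloud based on its current CPU load. The threshold, the process_locally and forward_to_cloud helpers, and the use of psutil for load measurement are illustrative assumptions, not part of any specific fog platform.

```python
import psutil  # assumed available for measuring CPU load on the gateway

CPU_LOAD_THRESHOLD = 70.0  # illustrative cutoff, in percent


def process_locally(reading: dict) -> dict:
    """Hypothetical local analysis at the fog node (smart gateway)."""
    values = reading["values"]
    return {"device": reading["device_id"], "avg": sum(values) / len(values)}


def forward_to_cloud(reading: dict) -> None:
    """Hypothetical upload of the raw reading to the cloud layer."""
    print(f"Forwarding reading from {reading['device_id']} to the cloud")


def handle_reading(reading: dict) -> None:
    """Process at the fog node when load is low, otherwise defer to the cloud."""
    current_load = psutil.cpu_percent(interval=0.1)
    if current_load < CPU_LOAD_THRESHOLD:
        result = process_locally(reading)
        print(f"Processed at fog node: {result}")
    else:
        forward_to_cloud(reading)


# Example usage with a mock air-quality reading
handle_reading({"device_id": "sensor-01", "values": [12.1, 13.4, 11.8]})
```

In practice the decision could also weigh network conditions or battery level, but the load check above captures the core idea of the smart gateway.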

The concept of fog computing is to bring networking and computing resources near the nodes that generate data. We refer to that layer as the perception layer, since data is perceived there. Fog creates another layer on top of the perception layer, while the cloud resides above all in the cloud layer; fog resources therefore lie between the perception layer and the cloud layer. Fog computing is an extension of the traditional cloud computing paradigm to the edge of the network, helping to create more refined and context-aware services. For mobile nodes, such as moving vehicles or drones, fog provides low-latency, high-quality streaming through proxies and access points placed along highways and tracks. Likewise, resource- and power-constrained individual nodes, WSNs, and Virtual Sensor Networks (VSNs) can take advantage of the presence of fog. Fog also suits services related to emergency and disaster management, gaming, healthcare, augmented reality, graph/data mining, and more. How well it does so, however, depends heavily on how resources are managed in fog computing. Resource management must be highly dynamic and tailored to the type of service and the type of device. The communication medium also plays an important role, since mobile and static nodes require different resource allocations even for the same service. One way to estimate resources dynamically is to incorporate the usage pattern and history of the customers of a service: since history gives the fog a degree of predictability, tailored resource estimation can be performed.
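As an illustration of history-based estimation, the following Python sketch predicts a customer's next resource demand from its usage history with a simple exponentially weighted moving average. The ResourceEstimator class, the smoothing factor, and the safety margin are illustrative assumptions rather than a prescribed fog algorithm.

```python
class ResourceEstimator:
    """Illustrative history-based estimator of a customer's resource demand."""

    def __init__(self, alpha: float = 0.3, safety_margin: float = 1.2):
        self.alpha = alpha                  # weight given to the most recent sample
        self.safety_margin = safety_margin  # over-provisioning factor for bursts
        self.estimate = None                # smoothed demand (e.g., CPU units or MB/s)

    def observe(self, usage: float) -> None:
        """Update the smoothed estimate with a new usage sample from history."""
        if self.estimate is None:
            self.estimate = usage
        else:
            self.estimate = self.alpha * usage + (1 - self.alpha) * self.estimate

    def predict_allocation(self) -> float:
        """Resources to reserve for the next interval, with a safety margin."""
        return 0.0 if self.estimate is None else self.estimate * self.safety_margin


# Example: feed past usage samples and predict the next allocation
estimator = ResourceEstimator()
for sample in [10.0, 12.5, 11.0, 15.0, 14.2]:
    estimator.observe(sample)
print(f"Suggested allocation: {estimator.predict_allocation():.1f}")
```

A moving average is only one possible predictor; the point is that keeping even a lightweight history per customer lets the fog tailor its resource reservations instead of allocating statically.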

Fog computing is not global like cloud computing, but it analyzes data much faster than cloud computing. Much of the data generated by IoT does not have to be handled globally (by cloud computing), so fog computing can fill that gap. The concept is that fog computing acts as a bridge from IoT to the cloud: all data generated by IoT is analyzed by fog computing first. Fog computing handles all the analysis that needs to be addressed quickly, and the results of that analysis are later forwarded to cloud computing. In other words, fog computing substitutes for cloud computing whenever cloud-scale management is not needed, so problems that must be addressed quickly can be handled immediately. In the future, fog computing will face several challenges, including:

  • Developing node management algorithms, such as scheduling algorithms and handover management mechanisms.

  • Structuring the fog computing infrastructure into several layers, including a data acquisition layer (DAL), an event classification layer (ECL), an information mining layer (IML), a decision-making layer (DML), and a cloud storage layer (CSL). To achieve efficient application services, each layer must perform its required function; a sketch of such a layered pipeline is shown after this list.

  • Designing new frameworks and architectures to further improve network quality, data security, and processing speed.
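To make the layered structure mentioned above more concrete, the following Python sketch chains the five layers as simple functions over a batch of sensor readings. The function names and the toy classification and mining logic are illustrative assumptions; real fog deployments would implement each layer with far richer components.

```python
def data_acquisition_layer(raw_readings):
    """DAL: collect raw sensor readings (here, a list of numeric samples)."""
    return [float(r) for r in raw_readings]


def event_classification_layer(samples, threshold=50.0):
    """ECL: tag each sample as 'normal' or 'alert' (illustrative threshold)."""
    return [("alert" if s > threshold else "normal", s) for s in samples]


def information_mining_layer(events):
    """IML: extract a simple summary from the classified events."""
    alerts = [s for label, s in events if label == "alert"]
    return {"total": len(events), "alerts": len(alerts), "max": max(s for _, s in events)}


def decision_making_layer(summary):
    """DML: decide on an action based on the mined information."""
    return "notify_operator" if summary["alerts"] > 0 else "no_action"


def cloud_storage_layer(summary, decision):
    """CSL: a real system would upload to cloud storage; here we just print."""
    print(f"Stored in cloud: summary={summary}, decision={decision}")


# Example run of the five-layer pipeline on mock air-quality data
samples = data_acquisition_layer([42, 55, 38, 61])
events = event_classification_layer(samples)
summary = information_mining_layer(events)
decision = decision_making_layer(summary)
cloud_storage_layer(summary, decision)
```

In a deployed system the lower layers would run on fog nodes close to the sensors, while only the summarized output would travel up to the cloud storage layer.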
