Introduction: What is Fog Computing? How is it Used in Machine Learning?
IoT is well known as one of the main sources of big data, since it connects a huge number of smart objects that continuously report their status. Although the IoT paradigm focuses on connecting objects, its true potential lies not in the physical objects themselves but in the valuable knowledge extracted from the data they generate. The Internet of Things is not about things; it is about data. In this context, machine learning (ML) is a useful tool for processing the generated data and transforming it into information, knowledge, predictions, insights, and automated decisions. Using ML techniques in IoT raises several challenges, especially regarding their computational requirements. Data produced by IoT devices can typically be processed in three different layers: the source layer (the things themselves), the cloud, and the intermediate or fog layer.
The performance of each layer depends on the QoS parameters and the complexity of the processing required. Current ML techniques impose an overhead on fog-layer devices, which are resource-constrained, hindering their widespread adoption in this setting. Because of the limited hardware and power supply of fog devices, running ML-driven decision-making in the fog is not an easy task. At the same time, a decision-making procedure built on information extracted from many devices across the network is far more reliable than one based on a single device's limited view of its context. Below, we discuss the challenges of implementing big data analytics in fog computing using the latest AI developments.
What is Fog Computing?
Fog computing is a decentralized, distributed computing infrastructure in which data and compute are located somewhere between the data source and the cloud. Like edge computing, fog computing brings the advantages and power of the cloud closer to where data is generated and processed. Both approaches place intelligence and processing near the data, usually to improve efficiency, but sometimes also for security or compliance reasons.
The fog metaphor comes directly from the meteorological term: fog is a cloud close to the ground. The term is often associated with Cisco; the company's product line manager, Ginny Nichols, is believed to have coined it. Cisco Fog Computing is a registered name; fog computing itself is open for anyone to use.
How does fog computing work?
Fog networking complements cloud computing: fogging enables short-term analytics at the edge, while the cloud provides long-term analytics. Edge devices and sensors generate data, but they often lack the compute and storage resources to perform complex analytics and machine learning tasks, so they are limited to basic functions. Cloud servers have the power to process large amounts of data quickly, but they are usually too far away to respond in time for latency-sensitive applications.
Additionally, having every endpoint connected to the internet and sending raw data to the cloud can have privacy, security, and legal implications, especially when dealing with sensitive data subject to laws in different countries. Fog computing is used for a variety of purposes, including smart grids, smart cities, and smart buildings. Improving computing capability at the fog layer directly improves the quality of service provided.
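This division of labor can be sketched in a few lines of Python. The class and field names below are illustrative, not part of any real fog platform: the node keeps a short rolling window for local, low-latency analytics and only forwards compact summaries upstream, instead of shipping every raw reading to the cloud.

```python
from collections import deque

class FogNode:
    """Toy fog node: keeps a short-term window for local analytics and
    forwards periodic summaries upstream for long-term cloud analysis.
    All names here are illustrative, not a real fog API."""

    def __init__(self, window_size=5):
        self.window = deque(maxlen=window_size)  # short-term, local state
        self.uplink = []  # stands in for the link to the cloud

    def ingest(self, reading):
        self.window.append(reading)
        # Short-term analytics happen locally, with low latency.
        local_avg = sum(self.window) / len(self.window)
        # Only a compact summary travels to the cloud, not the raw data.
        if len(self.window) == self.window.maxlen:
            self.uplink.append({"avg": local_avg, "max": max(self.window)})
        return local_avg

node = FogNode(window_size=3)
for r in [10, 12, 11, 30, 29]:
    node.ingest(r)
print(len(node.uplink))  # → 3 summaries instead of 5 raw readings
```

The privacy point above falls out of the same structure: raw readings never leave the local network, only aggregates do.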
Fog Computing and Its Structure
Fog computing is a new computing paradigm that can be seen as a link between the cloud and the edge of an organization's network. It provides computing, communication, and storage capabilities at the network's edge. This decentralized platform differs significantly from traditional architectures because the fog environment connects resources, including people and machines, to improve quality and safety of life by running cyber-physical social applications at the network edge.
Fog Nodes or Microdata Centers (MDCs)
Such applications can gather and analyze data in local microdata centers via fog computing. The MDC performs local data processing and analytics to reduce the amount of data transferred from the edge to the cloud, cutting network latency and improving overall performance, which matters especially for time-sensitive applications such as connected cars and healthcare. To reduce network congestion, bandwidth usage, and latency, MDCs are typically placed between the data source and the cloud data center. The MDC handles user requests itself instead of forwarding them to centralized, remote cloud servers.
A smart gateway is an important part of industrial IoT applications because it enables communication between the network layer (the internet) and the ubiquitous sensor network (IoT). An IoT gateway connects many operating devices, performs a variety of tasks, and serves general computing purposes. The gateway receives sensor data from the environment, incorporates it into the system, and then sends it on to the cloud for processing and analysis.
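The gateway's receive-normalize-forward role can be illustrated with a short sketch. Everything here (sensor names, field names, the batch size) is assumed for illustration; the point is that the gateway tags and normalizes raw readings, then batches them into one upstream transfer instead of one request per reading.

```python
import json
import time

def gateway_batch(sensor_readings, batch_size=4):
    """Toy IoT gateway: collects raw sensor readings from the local
    network, tags and normalizes them, and batches them for upstream
    transfer. An illustrative sketch, not a real gateway API."""
    batch, uploads = [], []
    for sensor_id, raw_value in sensor_readings:
        batch.append({
            "sensor": sensor_id,
            "value": round(raw_value, 2),   # normalize before shipping
            "ts": int(time.time()),          # timestamp at the gateway
        })
        if len(batch) >= batch_size:
            uploads.append(json.dumps(batch))  # one payload to the cloud
            batch = []
    if batch:
        uploads.append(json.dumps(batch))      # flush the remainder
    return uploads

readings = [("temp-1", 21.337), ("temp-2", 19.801), ("hum-1", 55.02),
            ("temp-1", 21.412), ("hum-1", 54.99)]
payloads = gateway_batch(readings, batch_size=4)
print(len(payloads))  # → 2: five readings become two upstream transfers
```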
Fog computing developed in response to the massive amounts of data being transmitted to cloud services and the severe latency and network bandwidth limitations that come with them. It acts as an intermediary computing layer between the cloud and IoT devices, distributed across numerous heterogeneous servers. Fog servers have fewer computing resources than cloud servers, so they are not ideal for the heaviest industrial IoT workloads. However, because they can be accessed locally, they provide better bandwidth and lower latency than cloud servers.
Fog computing optimizes resource allocation and management by balancing the load placed on different resources. Load balancing is an effective way to manage these resources, and it can be used in conjunction with task management to create a reliable system.
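A minimal sketch of such load balancing, assuming a least-loaded policy (real fog schedulers also weigh latency, locality, and energy; node names and task costs are made up):

```python
def pick_fog_node(nodes):
    """Least-loaded selection: route the next task to the fog node
    currently carrying the smallest load."""
    return min(nodes, key=lambda name: nodes[name])

def dispatch(tasks, nodes):
    """Assign each task to the least-loaded node at that moment,
    updating node loads as tasks are placed."""
    assignment = {}
    for task, cost in tasks:
        node = pick_fog_node(nodes)
        nodes[node] += cost
        assignment[task] = node
    return assignment

nodes = {"fog-a": 0, "fog-b": 0, "fog-c": 0}
tasks = [("t1", 5), ("t2", 3), ("t3", 2), ("t4", 4)]
plan = dispatch(tasks, nodes)
print(nodes)  # loads end up spread across all three nodes
```

Because each placement decision reacts to the current loads, no single fog device is overloaded while others sit idle, which is the reliability argument made above.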
Storage Subsystem. The fundamental objective of Industrial IoT is to acquire correct information in real time and then react accordingly to provide desired results. Fog and Edge Computing have been used to help improve service quality and user experience through effective distribution of data storage and processing across several geographically distributed locations.
Fog Computing vs. Cloud Computing
Fog computing architectures are made up of fog clusters that combine the computing power of several fog devices. Large data centers are the cloud's primary physical component, with expensive operating costs and high energy usage. Fog computing uses less energy and has lower operational costs than traditional cloud computing. Because the fog is close to the user, the distance between users and fog devices may be only one or a few network hops.
Due to distance, the cloud's communication latency is always greater than the fog's. In terms of orchestration, the fog represents a more distributed, geography-based approach, whereas the cloud represents a more centralized one. Because of its high latency, the cloud does not allow real-time communication; fog computing can alleviate this problem. On the other hand, the fog has a higher failure rate due to its dependence on wireless connectivity, decentralized management, and power outages, and fog devices are susceptible to failure when their software is not properly managed.
Drawbacks of the cloud-based model include the following:
Many industrial IoT applications have strict service-delay requirements, particularly in industrial settings and Internet-connected vehicles.
As the number of wireless industrial IoT devices grows, link bandwidth becomes increasingly congested, making it impractical to send all data, including raw sensor data, to remote clouds for processing.
Industrial IoT devices are often limited in their computational power due to energy and cost constraints.
Fog computing vs. edge computing
The key difference between edge computing and fog computing, according to the OpenFog Consortium launched by Cisco, is where the intelligence and computing power reside. In a fog environment, intelligence sits on the local area network (LAN): data travels from endpoints to a fog gateway for processing and is then returned to the endpoints by way of that gateway.
In edge computing, intelligence and power can sit at either the endpoint or the gateway. Its advocates point to few single points of failure, because each device operates independently and decides which data to store locally and which to send to a gateway or the cloud for further analysis. Fog computing proponents counter that their approach is more scalable, since multiple data points can feed into it, and that it provides a better overall picture of the network.
Despite this, some network engineers believe fog computing is simply a Cisco brand for one approach to edge computing.
How and why is fog computing used?
There are any number of potential uses for fog computing. Traffic control is one increasingly common use case. Because traffic sensors are usually connected to cellular networks, some cities place computing resources near the cell towers; real-time analytics then lets traffic signals respond promptly to changing conditions.
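As a concrete (and deliberately simplified) sketch of that fog-local analytics step, the function below adapts a signal's green phase to a real-time vehicle count. All the numbers and direction names are made up for illustration; a real controller would use far richer timing models.

```python
def green_time(vehicle_count, base=20, per_vehicle=2, max_green=60):
    """Length of the green phase (seconds) for one approach, scaled by
    the vehicle count computed at a fog node near the intersection.
    The constants are illustrative, not real traffic-engineering values."""
    return min(base + per_vehicle * vehicle_count, max_green)

# Latest per-direction counts from the fog node's sensor window:
counts = {"north": 4, "south": 18, "east": 2, "west": 7}
phases = {direction: green_time(c) for direction, c in counts.items()}
print(phases)  # → {'north': 28, 'south': 56, 'east': 24, 'west': 34}
```

Because this runs next to the intersection rather than in a distant cloud, the signal can react within the same cycle.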
Autonomous vehicles can also benefit from this concept. Thanks to their vast onboard computing power, autonomous vehicles function as edge devices: they must ingest data from numerous sensors, analyze it in real time, and respond accordingly.
Because autonomous vehicles do not require cloud connectivity to operate, it is tempting to think of them as disconnected devices. But although an autonomous vehicle must be able to drive safely without cloud connectivity, it can still use connectivity when it is available. Some cities are considering how an autonomous vehicle might share the same computing resources as traffic lights: such a vehicle could act as an edge device and relay real-time data to a platform that also receives traffic data from other sources, allowing that platform to control traffic signals more effectively.
What are the benefits of fog computing?
Like any other technology, fog computing has both strengths and weaknesses. Here are some advantages of fog computing:
Bandwidth conservation. By reducing the amount of data sent to the cloud, fog computing reduces bandwidth consumption and related costs.
Reduced latency. Since the data processing occurs near the data, the response time is improved. Millisecond-level responsiveness enables near-real-time processing of data.
Network flexibility. Fog computing generally places compute resources at the LAN level, as opposed to the device level as in edge computing, but the network itself can be part of the fog architecture. Because fog computing is network-agnostic, it can run over wired, wireless, or even 5G networks.
ML at the edge. Trained neural networks and other ML models can be deployed on fog nodes to improve performance and support anomaly detection.
Real-time monitoring. Monitoring performance metrics from smart devices in real time is a game changer, because it allows businesses to adapt immediately based on the metrics reported.
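A minimal sketch of such a monitor, assuming a rolling-window average compared against a fixed threshold (the metric, threshold, and window size are all illustrative):

```python
from collections import deque

class MetricMonitor:
    """Rolling-window monitor for a device metric; records an alert when
    the recent average drifts past a threshold, so operators can react
    in real time. Threshold and window size are made-up settings."""

    def __init__(self, threshold, window=10):
        self.threshold = threshold
        self.history = deque(maxlen=window)
        self.alerts = []

    def report(self, value):
        self.history.append(value)
        avg = sum(self.history) / len(self.history)
        if avg > self.threshold:
            self.alerts.append(round(avg, 1))  # record the breach
        return avg

mon = MetricMonitor(threshold=75.0, window=3)
for cpu_pct in [60, 70, 72, 85, 90]:
    mon.report(cpu_pct)
print(mon.alerts)  # → [75.7, 82.3]: the window average crossed 75 twice
```

Using a short window rather than a single reading avoids alerting on momentary spikes while still reacting within a few samples.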
Load distribution. Distributing computing tasks across fog nodes reduces load and can enhance performance without overloading cloud services.
What are the disadvantages of fog computing?
Fog computing does, however, have some disadvantages, such as the following:
Physical location. Since fog computing is tied to a physical location, it undermines some of the “anytime/anywhere” benefits associated with cloud computing.
Security. Under the right circumstances, fog computing can be subject to security issues, such as IP address spoofing or man-in-the-middle (MitM) attacks.
Startup costs. Fog computing uses both edge and cloud resources, which means there are associated hardware and deployment costs on both fronts.
Ambiguous concept. There is still some ambiguity surrounding fog computing’s definition, even though it has been around for several years. Different vendors define fog computing differently.
Fog computing and the Internet of Things
Fog computing is common in Internet of Things (IoT) applications where a cloud-only approach is not viable. Smart sensors and IoT devices generate large amounts of data that would be costly and slow to ship to the cloud for processing and analysis. Fog computing reduces the bandwidth needed and the back-and-forth communication between sensors and the cloud, both of which can otherwise hurt IoT performance.
Fog computing and 5G
A fog computing architecture uses a series of nodes to receive real-time data from IoT devices. These nodes process the data in real time, with millisecond response times, and periodically send summary information to the cloud, where a cloud-based application analyzes it and provides actionable insight based on the data received from the various nodes.
Such an architecture requires more than just computing power. A high-speed connection between the IoT devices and nodes is required. The goal is to process data within milliseconds. Different use cases require different connectivity options. IoT sensors on a factory floor, for example, are likely to use a wired connection. However, mobile resources, such as an autonomous vehicle, or isolated resources, such as a wind turbine in the middle of a field, will require an alternate form of connectivity. A compelling reason to choose 5G is that it provides the high-speed connectivity necessary for data to be analyzed in near-real time. 5G enables this to be done via cellular networks.
Fog Computing: Outcomes at the Edge with Machine Learning
Edge computing (or fog computing) is a method of optimizing cloud computing systems by performing data processing near the edge of the network, and it is a natural step beyond cloud computing. Today's smartphones typically send their data to the cloud for processing and storage, with the results sent back to the device, but using the cloud this way for every device would not be practical.
A number of examples demonstrate how edge computing gives IoT a competitive edge. In industrial internet of things applications such as aviation, smart traffic lights, or manufacturing, edge devices can capture streaming data that can be used to prevent parts from failing, reroute traffic, optimize production, and prevent product defects. Data analysis at the edge of a network is known as Edge Analytics. Data is more valuable at the edge:
Data has a time value, meaning that the data you have today won’t mean as much a week, a day, or even an hour from now. As a result, organizations are using edge computing to provide real-time analytics that impact the bottom line, and in some cases, prevent disasters from ever occurring. Other factors driving edge computing adoption include the proliferation of IoT sensors, video cameras, social media, and other streaming data.
For data analytics, organizations currently rely on large, complex clusters with a number of bottlenecks: data pipelining, indexing and extraction, and transform-and-load processes. To maintain or gain a competitive advantage, today's organizations need fast, actionable insight that correlates newly acquired data with legacy data, rather than using centralized infrastructure only to analyze static or historical data.
Much data is most valuable the instant it is collected, as in the case of financial fraud or a hacker accessing accounts, and it loses that value during the process of moving it to a centralized data center or uploading it to the cloud. Slow decision-making is not acceptable when an edge computing platform can provide near-instant intelligence by eliminating data movement. Using data analytics at the edge, organizations can fight fraud or prevent data breaches in real time.
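One simple way to catch such events at the edge, before any round trip to a data center, is a streaming z-score check on a running mean and standard deviation (maintained with Welford's algorithm). The cutoff and warm-up length below are illustrative settings, not recommendations.

```python
import math

def streaming_zscore_alerts(values, z_cut=3.0, warmup=5):
    """Flag values as they arrive when they sit more than z_cut running
    standard deviations from the running mean. Statistics are updated
    incrementally (Welford's algorithm), so no history is stored."""
    n, mean, m2 = 0, 0.0, 0.0
    alerts = []
    for i, x in enumerate(values):
        if n >= warmup:
            std = math.sqrt(m2 / (n - 1))
            if std > 0 and abs(x - mean) / std > z_cut:
                alerts.append(i)  # suspicious event, caught immediately
        # Welford update of the running mean and sum of squared deviations
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    return alerts

# Ordinary transaction amounts, then one outlier:
amounts = [20, 22, 19, 21, 23, 20, 500, 21]
print(streaming_zscore_alerts(amounts))  # → [6]: the 500 is flagged on arrival
```

Because the detector keeps only three numbers of state, it fits comfortably on a resource-constrained fog node.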
Three Use Cases for Fog Computing:
Remote monitoring for Oil & Gas operations:
The use of edge computing in oil and gas operations can make the difference between a disaster and normal operation. Centralized data analytics infrastructure can explain what caused downtime after the fact, or predict failure with supervised learning on a training dataset. But rules that perform near-instant analysis on data at the site, as it is created, can detect the signs of a disaster and prevent it before it begins.
Machine Learning Models: anomaly detection models (e.g., Kalman-filter-based anomaly detection), predictive models (e.g., Bayesian change detection), and optimization methods (e.g., linear optimization).

Retail customer behavior analytics:

Near-instant edge analytics, run where sales data, images, coupon usage, traffic patterns, and video are created, can reduce cart abandonment, improve customer engagement, and provide unparalleled insight into consumer behavior. Retailers can use this intelligence to target merchandise, sales, and promotions, and to redesign store layouts and product placement to improve the customer experience. Using beacons, for example, retailers can collect information such as transaction history from customers' smartphones and then target promotions and sale items as shoppers walk through the store.

Machine Learning Models: statistical methods (e.g., market basket analysis with Apriori), time series clustering, classification models, neural networks, etc.
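To make the market basket idea concrete, here is the first step of an Apriori-style analysis: counting item pairs that co-occur across baskets. The basket contents and support threshold are invented for illustration; a full Apriori implementation would iterate this to larger itemsets and derive association rules.

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(baskets, min_support=2):
    """Count item pairs that co-occur across shopping baskets and keep
    those appearing in at least min_support baskets — the candidate
    pairs an Apriori-style market basket analysis starts from."""
    counts = Counter()
    for basket in baskets:
        # sorted() gives each pair a canonical order; set() ignores duplicates
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return {pair: c for pair, c in counts.items() if c >= min_support}

baskets = [
    ["bread", "milk", "eggs"],
    ["bread", "milk"],
    ["milk", "coupon-cereal"],
    ["bread", "eggs"],
]
print(frequent_pairs(baskets))  # pairs seen in at least two baskets
```

Run at a fog node in the store, this kind of counting can feed same-visit promotions rather than next-week reports.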
Self-Driving Cars:
The next generation of advanced driver assistance systems (ADAS) will make cars safer and more efficient by making them increasingly aware of, and responsive to, the surrounding driving environment and conditions. For ADAS to succeed, it must be democratized: available to everyone, from first-time drivers to seniors, in passenger vehicles as well as commercial vehicles.
As next-generation technologies and self-driving vehicles are developed and deployed in the near future, we need to look at how a collective set of systems inside the car can deliver a better experience than treating the car as a collection of independent technologies.
ML Models: image processing, classification models, anomaly detection (e.g., outlier detection with Isolation Forests), reinforcement learning, visual SLAM (VSLAM) technologies, etc.
Autonomous vehicles can also combine forces, forming internet-of-vehicles clusters that help control systems and manage traffic across cities, reducing congestion without putting extra load on centralized cloud resources.
Securing Industrial IoT Through Fog Computing
Fog’s dispersed design safeguards linked systems by placing computing, storage, networking, and communications closer to the services and data sources, thereby providing an added layer of security. Fog nodes perform a variety of security tasks on networked devices, even the tiniest and most resource-constrained ones, to protect cloud-based industrial IoT and fog-based services. Security credentials, malware detection, and software patch distribution at scale can be managed and upgraded with the fog, which provides a trusted distributed platform and execution environment for applications and services.
Fog provides reliable communication and enhanced security through the detection, verification, and reporting of attacks. By monitoring the security status of the devices around it, the fog can detect and isolate risks quickly in the event of a security breach. Through the fog, blockchains can be deployed to IoT endpoints at low cost. Using fog's node-based root-of-trust capabilities, operations managers can remotely isolate and shut down affected generators if multiple power generators are hit by malware, with minimal service interruption. Fog nodes can likewise protect domains from hackers exploiting a vulnerability in assembly-line equipment: traffic from the internet into the distributed fog network is monitored, and machine learning is used to detect a potential attack as it appears.
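The isolate-and-keep-serving response described above can be sketched in a few lines. The device names and scan results below are invented; real fog security would rest on attested root-of-trust mechanisms, not a dictionary lookup.

```python
def quarantine_compromised(devices, scans):
    """Sketch of a fog node's breach response: devices whose security
    scan flags malware are isolated, while healthy neighbors keep
    serving, minimizing service interruption. Names are illustrative."""
    isolated, active = [], []
    for device in devices:
        if scans.get(device) == "malware":
            isolated.append(device)   # cut off from the network
        else:
            active.append(device)     # continues normal operation
    return {"isolated": isolated, "active": active}

devices = ["gen-1", "gen-2", "gen-3", "gen-4"]
scans = {"gen-1": "clean", "gen-2": "malware",
         "gen-3": "clean", "gen-4": "malware"}
state = quarantine_compromised(devices, scans)
print(state["active"])  # → ['gen-1', 'gen-3']: service continues on healthy units
```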
What is next for fog computing?
By utilizing fog computing, services can be provided more quickly while bypassing the wider internet, whose speeds depend heavily on carriers.
Several companies, including Google and Facebook, are considering alternative means of internet access, including balloons and drones, to avoid network bottlenecks. However, smaller organizations might be able to create a fog out of whatever devices are available to establish closer and quicker connections to their computing resources.
Even though there is a place for more centralized and aggregated cloud computing, it appears that as sensors are integrated into more things and data grows at an incredible rate, a new way of hosting applications will be required. Using fog computing, which can inventively use existing devices, could be the right approach to hosting a whole new set of applications.
However, the shift to the edges does not detract from the importance of the center. In fact, it means the data center must become an even stronger nucleus of the expanding computing architecture. According to InformationWeek contributor Kevin Casey, the cloud hasn't actually diminished server sales as one might expect: big data, IoT, and hybrid computing models have shifted server requirements, but demand is not abating as some experts had predicted.
As organizations seek to balance enterprise-grade data center needs with support for growing edge networks, the IoT serves as a bridge across some of the biggest differences between the cloud and the fog, such as bandwidth. This includes mobile edge computing.
Fog and cloud computing work in unison to provide real-time analytics that help businesses make informed, data-driven decisions, while also providing cover against an ever-changing cyber threat landscape and helping businesses deal with security issues.