I had the opportunity to be among a diverse group of industry analysts and influencers that Cisco Systems brought to their C-Scape conference in Dubai during the IoT World Forum at the end of 2015. I am intrigued by Cisco’s IoT strategy, and it was fruitful to learn more about a term they coined, “fog computing,” which is the subject of my latest research below.
For millennia, humans consulted the Oracle of Delphi for decision making. Over the decades, humans have tried different methods to make better decisions and enable a better life. Not only humans but also animals have both fast, low-accuracy and slow, high-accuracy mechanisms available, which can be used adaptively according to the current context. Bees, for example, scan a site several times to gather information about flowers and to make sure no predators, such as spiders, are present. Successful forager bees then perform a “waggle dance” inside the hive to indicate to waiting bees the distance and direction of food sources or of new nest sites.
However, when a colony’s original nest is destroyed, the swarm may decide faster, trading some accuracy for speed. Or take bats: a bat captures information by emitting ultrasonic sounds, locating potential prey and attacking it within milliseconds.
When we think about latency-sensitive decision making in the Big Data age, we may need to capture streaming data just like the bat and perform an action in milliseconds. In other words, we capture and analyze the data at the same time, in a local context. That is different from other kinds of decision-making processes, which follow Bloom’s taxonomy: we remember (persist) the data, understand it, apply (filter) it, analyze it, and then let data scientists evaluate it. I’ll cover this taxonomy in later posts.
With the emergence of Smart Grids, Smart Connected Vehicles, Smart Traffic Light Systems, Smart Cities, Industrial Automation, Precision Agriculture, and data-driven healthcare, there is a growing need to support applications that are both geographically distributed and latency-sensitive. Traditional cloud computing platforms are not suited to processing such applications in a timely manner.
To that end, we have witnessed the emergence of fog computing. The idea behind fog computing, and the communication paradigms that will emerge alongside it, is to make the data smarter: we move processing, analytics, and intelligence to the data, at the edge where devices reside. That largely eliminates the latency of sending data back and forth to data centers in the cloud. That being said, such paradigms won’t replace existing cloud computing platforms, which remain necessary for applications that are not latency-critical.
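To make the idea concrete, here is a minimal sketch, in Python, of the pattern described above. All names here (`EdgeNode`, `ingest`, the threshold and batch size) are hypothetical illustrations, not any vendor’s API: the edge node reacts to latency-critical readings locally, and forwards only compact summaries upstream for cloud analytics.

```python
# Hypothetical fog/edge node: act locally on latency-critical readings,
# forward only batched summaries to the cloud (names are illustrative).
from statistics import mean

class EdgeNode:
    def __init__(self, threshold, batch_size=5):
        self.threshold = threshold    # local actuation threshold (assumed)
        self.batch_size = batch_size  # readings per cloud-bound summary
        self.buffer = []
        self.actions = []             # immediate local actions taken
        self.uplink = []              # summaries destined for the cloud

    def ingest(self, reading):
        # Latency-critical path: decide immediately, no cloud round trip.
        if reading > self.threshold:
            self.actions.append(("actuate", reading))
        # Non-critical path: batch and summarize for later cloud analytics.
        self.buffer.append(reading)
        if len(self.buffer) == self.batch_size:
            self.uplink.append({"mean": mean(self.buffer),
                                "max": max(self.buffer)})
            self.buffer = []

node = EdgeNode(threshold=80.0)
for r in [72.1, 95.3, 64.0, 88.7, 70.2]:
    node.ingest(r)

print(node.actions)  # local reactions, taken in-context without the cloud
print(node.uplink)   # one compact summary sent upstream instead of 5 readings
```

The design choice is the split itself: the time-critical decision never leaves the edge, while the cloud still receives enough aggregated data for the slower, Bloom-style analysis described earlier.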
We can think about fog computing as represented in the figure below, with its six dimensions, along with various interesting use cases. The contextual distribution of devices and the latency requirements of decisions are the main pillars. I will cover these topics in more detail in later posts.
Feel free to share your thoughts about fog computing and potential use cases in your industry.
This work is licensed under a Creative Commons Attribution 4.0 International License.