Cloud computing, fog computing—the terms themselves reflect the hazy understanding people have of what these technologies mean. By way of analogy, imagine a central bank where everyone deposits their money and manages their accounts. Their cash is stored and handled off-site, rather than stuffed into a mattress at the risk of fire or theft. Cloud computing works the same way, except that customers store and process computer data instead of cash.

Now imagine that, instead of one central bank, there are bank branches, ATMs, and financial services locations scattered throughout the city. That is what fog computing is like. Instead of reaching a central data center, which could be quite far away, your devices draw on storage and processing power from IoT sensors, devices, and resources scattered all around you. Spreading out the workload speeds up latency-sensitive applications such as gaming and video streaming. However, getting all of these IoT devices and sensors to work in harmony is a tall order, especially because customers' business needs and data requirements vary widely and change from day to day.
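To make the idea concrete, here is a minimal, hypothetical sketch of the dispatch decision described above: a device picks the lowest-latency node (a nearby fog node or the distant cloud data center) that still has room for its job. The node names, latencies, and capacities are invented for illustration and are not from the article.

```python
# Illustrative sketch: choose the nearest compute node with spare capacity,
# falling back to the distant cloud only when no fog node can take the job.
# All values below are made up for demonstration purposes.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ComputeNode:
    name: str
    round_trip_ms: float   # measured network latency to this node
    free_capacity: float   # fraction of compute capacity currently free (0..1)


def pick_node(nodes: list[ComputeNode], needed_capacity: float) -> Optional[ComputeNode]:
    """Return the lowest-latency node that can accept the job, if any."""
    candidates = [n for n in nodes if n.free_capacity >= needed_capacity]
    return min(candidates, key=lambda n: n.round_trip_ms) if candidates else None


if __name__ == "__main__":
    nodes = [
        ComputeNode("cloud-datacenter", round_trip_ms=90.0, free_capacity=0.9),
        ComputeNode("fog-gateway-lobby", round_trip_ms=4.0, free_capacity=0.2),
        ComputeNode("fog-roadside-unit", round_trip_ms=7.0, free_capacity=0.6),
    ]
    # A latency-sensitive job (say, a frame of a streamed game) lands on the
    # nearby roadside unit; only if no fog node has room does it travel to
    # the far-away cloud data center.
    chosen = pick_node(nodes, needed_capacity=0.5)
    print(f"Job dispatched to: {chosen.name if chosen else 'no node available'}")
```

In a real deployment the latencies and free capacities would be measured and would shift constantly as devices and demands change, which is exactly the coordination challenge the article points to.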
About Lori Cameron
Lori Cameron is a Senior Writer for the IEEE Computer Society and currently writes regular features for Computer magazine, Computing Edge, and the Computing Now and Magazine Roundup websites. Contact her at l.cameron@computer.org.