
What the fog? Rise of the ‘intelligent edge’

It’s unusual for me to stumble across some form of technology in ‘mainstream’ news rather than spotting it early on Hacker News, but that is exactly what happened as I was reading this week’s issue of The Economist:

Life on the edge: The era of the cloud’s total dominance is drawing to a close

‘Fog Computing’ is a new term for an upcoming form of computing that is meant to be the next evolution of the cloud.

Cloud Computing today can be (very simply) described as pushing all your work up to some big data center somewhere for processing. Typically this is a public cloud (AWS, Google Cloud and Azure being the big three), and you as a developer are abstracted away from the actual infrastructure and can just deploy stuff.

Fog Computing is a recognition that the device producing the data in the first place, be it a phone, a tablet or an assistant like Google Home or Echo, is powerful in its own right, and that a lot of the computing can be done on the device itself.

This has many benefits: it reduces the amount of work that needs doing in the cloud (scalability), and it helps combat some of the techno-panic going on at the moment, which is particularly acute in finance but also felt more broadly thanks to the new GDPR regulations coming in. With ‘Fog Computing’ the captured data never has to leave the device in some cases, or can at least be kept inside the same country.

There are two variants of this architecture:

  1. On-device computing: as seen with the release of TensorFlow binaries that can run a full model inside an Android app. The idea is that you use the big, expensive cloud to train your model on your vast dataset (tera/petabyte scale) and then distribute the trained model to the ‘edge’ (megabyte scale). This way you get your inference result locally and can act on it immediately, without the long round trip to the cloud and back (see the first sketch after this list).
  2. Edge computing: Cloudflare recently released ‘Workers’, which lets you operate on HTTP requests at their edge nodes/points of presence (see the second sketch after this list). Companies have been doing this for years on a larger scale; Netflix, for example, puts its own edge-cache hardware in last-mile data centers across the planet, and AWS has even started running Lambda inside its CloudFront edge locations to similar effect.
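
To make the first variant concrete, here is a minimal sketch of the train-in-the-cloud, infer-on-the-device pattern. The post talks about TensorFlow binaries inside an Android app; to keep a single language across both sketches I have used TensorFlow.js in TypeScript instead, and the model URL, input shape and score output are all invented for illustration.

```typescript
import * as tf from '@tensorflow/tfjs';

// Hypothetical URL of a model that was trained in the cloud and exported
// for on-device use; the input features are also made up.
const MODEL_URL = 'https://example.com/models/demo/model.json';

async function classifyOnDevice(features: number[]): Promise<number> {
  // Fetch the small, already-trained model; in a real app you would load
  // it once and cache it. Inference then runs locally, so the raw input
  // data never leaves the device for this step.
  const model = await tf.loadLayersModel(MODEL_URL);

  const input = tf.tensor2d([features]);   // batch of one
  const output = model.predict(input) as tf.Tensor;
  const [score] = await output.data();     // e.g. a probability

  input.dispose();
  output.dispose();
  return score;
}

// Usage: act on the result immediately, no round trip to the cloud.
classifyOnDevice([0.2, 0.7, 1.0]).then((score) => {
  console.log(`local inference result: ${score.toFixed(3)}`);
});
```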

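And for the second variant, this is roughly what a Cloudflare Worker looks like: a fetch handler that runs at the point of presence nearest the user and can answer some requests without ever touching the origin. The /ping route and the x-served-by header are made up for illustration, and the snippet assumes the Workers type definitions for FetchEvent.

```typescript
// Entry point of a Cloudflare Worker: intercept the request at the edge.
addEventListener('fetch', (event: FetchEvent) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);

  // Example edge logic (made up): answer a health check directly at the
  // point of presence, without involving the origin server at all.
  if (url.pathname === '/ping') {
    return new Response('pong from the edge', { status: 200 });
  }

  // Otherwise pass the request through to the origin and tag the response
  // so we can see it was handled by an edge worker.
  const originResponse = await fetch(request);
  const response = new Response(originResponse.body, originResponse);
  response.headers.set('x-served-by', 'edge-worker');
  return response;
}
```
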
This is a development I will be watching closely, not only as a way to reduce latency but also cost, since you are now offloading that work to the devices of the people using your app/service. The best source on the direction of this technology is probably the entertainingly named ‘Fog World Congress’!

Published Jan 22, 2018

cloud native product manager in london