
Revving Up For Edge Computing - SemiTom
https://semiengineering.com/revving-up-for-edge-computing/
======
etaioinshrdlu
Does anyone else feel like "Edge" is a made-up term just for deluding investors?

The client/server split has been around forever. Edge means the same thing as
a client. With IoT you have more clients. Modern smartphones and Raspberry
Pi-type hardware are already great at neural-net inference... The future is
already here.

~~~
zzzeek
I work at Red Hat and am involved with edge technologies. If you look at it in
a generalized way, sure, it seems like "huh, it's client/server with lots of
clients," but it really refers to novel architectures being deployed by
customers where there are more computing resources out in the field than would
normally be present.

More simply, Chick-Fil-A's post about their edge efforts really made the term
clear for me when it first came out - a miniature datacenter in every store.
This is definitely novel: [https://medium.com/@cfatechblog/edge-computing-at-chick-fil-...](https://medium.com/@cfatechblog/edge-computing-at-chick-fil-a-7d67242675e2)

~~~
jka
That's one of the most interesting architectural / conceptual blog posts I've
seen in a while, thanks for sharing!

(it makes me think there must be a ton of interesting scenarios around
applications where images/containers migrate between the core/cloud and the
edge on-demand (or simply have edge replicas), and situations in which
mapreduce style computation could be farmed out to edge clusters before
sending aggregated results home)
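A toy sketch of that mapreduce-style idea, purely illustrative (the store names and readings are made up, and this is plain Python rather than any real edge framework): each node computes a small partial result locally, and only those aggregates travel home.

```python
from functools import reduce

# Hypothetical per-store sensor readings, held locally at each edge node.
edge_nodes = {
    "store-1": [3, 5, 2],
    "store-2": [8, 1],
    "store-3": [4, 4, 4],
}

def map_local(readings):
    # Map step runs at the edge: compute a (sum, count) partial locally.
    return (sum(readings), len(readings))

def reduce_partials(a, b):
    # Reduce step merges partials; only these tiny tuples leave the edge.
    return (a[0] + b[0], a[1] + b[1])

partials = [map_local(r) for r in edge_nodes.values()]
total, count = reduce(reduce_partials, partials)
print(total / count)  # → 3.875 (global average from aggregates alone)
```

The point being that the raw readings never cross the WAN; only the fixed-size partials do, which is exactly the bandwidth win being speculated about.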

~~~
nloladze
So every edge device runs its own Docker container, and the fog or the nodes
act as a swarm manager before depositing the relevant information into the
cloud? I've always thought we would see drone-based air-traffic controllers to
manage and handle air traffic, especially as flying vehicles arrive in the
future.

~~~
jka
I'm not sure I completely grasped the air-traffic control example you
mentioned, but I think you're on the same track, yep. Low-latency, localized
computation within a well-connected physical area.

To try to make it more concrete, here's a scenario:

Imagine that smartphones are able to opt-in to become part of an edge cluster
which communicates over some kind of local network fabric (be that wi-fi / 5G
/ similar).

Now imagine that you have 2,000+ fans at a sporting event or live performance,
all with their phones and many taking live recordings.

Shifting high-fidelity video/audio data from that many devices at an event
back to the cloud in realtime might not be particularly feasible and/or useful
for various network contention, bandwidth, and latency reasons (both client
and server-side).
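As a rough back-of-envelope for why that upload isn't feasible (the per-device bitrate is an assumption for high-fidelity live video, not a figure from the post):

```python
devices = 2000                # fans streaming at the venue, per the scenario
uplink_per_device_mbps = 8    # assumed bitrate for high-fidelity live video
aggregate_mbps = devices * uplink_per_device_mbps
print(aggregate_mbps / 1000, "Gbps")  # → 16.0 Gbps of concurrent uplink
```

Sixteen gigabits per second of sustained uplink from one venue is well beyond what a typical cell or wi-fi backhaul handles, which is what makes local, in-cluster processing attractive.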

But if those devices were already running a containerized application to
perform -- say, 3D image stitching[0] -- you could collect, compute, and
redistribute results via the cluster in near-real-time, potentially providing
some pretty immersive audience experiences (3D highlights and replays, and all
kinds of interesting augmented-reality interpolation of audience &
performers).

It'd also raise questions around who owns/authorizes the on-device
computation, who has permission and copyright over the data captured, and
various other issues.

All very hypothetical ideas, but technically feasible. Predicting fast-food
order demand is a much more practical challenge to begin with, no doubt :)

[0] - [https://www.geekwire.com/2013/microsoft-updates-photosynth-w...](https://www.geekwire.com/2013/microsoft-updates-photosynth-ways-stitch-images/)

