
Tim Bray: Serverless Latency? - yarapavan
https://www.tbray.org/ongoing/When/201x/2018/12/14/SF-4
======
PaulHoule
(1) For some applications it is all about the "cold start" latency. I am
building up a "smart home" system, and that involves the system doing quite a
bit of work ahead of time so that you don't wait 0.4 seconds for a list of
albums in your music system to display the first page, or for weather radar to
load, etc.
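The "work ahead of time" idea can be sketched in Go (the language the quoted article favors below). This is a minimal, hypothetical cache: a slow fetch runs off the request path, so reads are served instantly from the warm result. The names (`Cache`, `slowFetch`) are illustrative, not from any real system.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// Cache holds a precomputed result so a request never waits on the
// underlying slow fetch; refresh runs off the request path.
type Cache struct {
	mu    sync.RWMutex
	value string
}

func (c *Cache) Get() string {
	c.mu.RLock()
	defer c.mu.RUnlock()
	return c.value
}

func (c *Cache) refresh(fetch func() string) {
	v := fetch() // the slow work happens here, not in Get
	c.mu.Lock()
	c.value = v
	c.mu.Unlock()
}

func main() {
	// slowFetch stands in for e.g. listing albums or loading radar tiles.
	slowFetch := func() string {
		time.Sleep(50 * time.Millisecond)
		return "first page of albums"
	}

	c := &Cache{}
	c.refresh(slowFetch) // warm the cache ahead of time

	go func() { // keep it fresh in the background
		for {
			time.Sleep(time.Second)
			c.refresh(slowFetch)
		}
	}()

	start := time.Now()
	fmt.Println(c.Get()) // served from the warm cache
	fmt.Println("served in", time.Since(start))
}
```

A real system would also handle fetch errors and staleness; the point here is only that the 0.4 seconds is paid before the user asks, not after.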

(2) There is a way to eliminate 20-second waits at P100, and that is to abort
at 1 second, not start requests which are obviously going to take >1 second,
etc. This is a "path not taken" because most people would rather deal with the
long waits than with the failures, but serving one of those 20-second requests
can sometimes mean 100 other requests are late.

------
yarapavan
But my personal favorite choice for serverless compute is the Go programming
language. It's got great, clean, fast tooling, it produces static binaries,
it's got superb concurrency primitives that make it easy to avoid the kind of
race conditions that plague anyone who goes near java.lang.Thread, and
finally, it is exceedingly readable, a criterion that weighs more heavily
with me as each year passes. Plus the Go Lambda runtime is freaking excellent.
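The concurrency style the quote alludes to can be shown in a few lines: instead of mutating a shared counter under a lock (the `java.lang.Thread` failure mode), goroutines send partial results over a channel. This is a generic illustration, not code from the article:

```go
package main

import "fmt"

// sum fans the slice out to goroutines and collects partial sums over a
// channel: memory is shared by communicating, so there is no racy
// shared counter to get wrong.
func sum(nums []int, workers int) int {
	results := make(chan int, workers)
	chunk := (len(nums) + workers - 1) / workers
	n := 0
	for i := 0; i < len(nums); i += chunk {
		end := i + chunk
		if end > len(nums) {
			end = len(nums)
		}
		n++
		go func(part []int) {
			s := 0
			for _, v := range part {
				s += v
			}
			results <- s // each goroutine owns its own slice and sum
		}(nums[i:end])
	}
	total := 0
	for i := 0; i < n; i++ {
		total += <-results
	}
	return total
}

func main() {
	nums := make([]int, 100)
	for i := range nums {
		nums[i] = i + 1
	}
	fmt.Println(sum(nums, 4)) // 5050
}
```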

------
yarapavan
How to talk about it · To start with, don't just say "I need 120ms." Try
something more like "This has to be in Python, the data's in Cassandra, and
I need the P50 down under a fifth of a second, except I can tolerate
5-second latency if it doesn't happen more than once an hour." And in most
mainstream applications, you should be able to get there with serverless.
If you plan for it.

