
Ask HN: What's your opinion about service meshes? - gtirloni
I remember there was a lot of hate for SOA/ESB in the 2000s.

Now with microservices, we're seeing service meshes (Istio, Conduit, ...).

What's different this time?
======
moxious
Some differences from my perspective:

\- SOA/ESB in the 2000s was driven more by business, less by available tech.
As an example of what I mean here, SOA folks would usually talk about
decomposing business processes. E.g. Amazon sells something, so there's a
"shopping cart" service, a "payment" service, and a "delivery" service. (I'm
simplifying, but you get the point.) Modern microservices are way more
granular than that, and it could be that you have 5 tech-oriented services
that compose into a single sub-step of a business process. The original SOA
idea had the concept right, but was (IMHO) still too coarse to make the idea
work.

\- Several enabling techs that have arrived since then (containerization
being a key example) didn't exist at the time, which made doing the same
thing 5x more painful. So the tech just grew up.

\- On the data side, SOA/ESB was driven by XML and XML Schema. As someone who
practiced a lot of that, it was really painful to use a document markup
language to do structured data exchange. XML was popular because, in the age
of proprietary formats, it was the first open, text-based, non-license-encumbered
format. So don't get me wrong, there wasn't anything better _at the time_,
but that didn't make XML actually good for the job. Note this is _not_ a
comment that JSON is better. In 2018 we're in a world where open data
standards are the default. So it's not XML vs. JSON; it's XML vs. the entire
rest of the world, and you have so many options.
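To make the encoding point concrete, here's a small sketch (in Python, with a made-up record, purely illustrative) of the same data serialized both ways. The pain isn't verbosity so much as that XML flattens everything to text, so types and structure have to be rebuilt by convention on the receiving side:

```python
import json
import xml.etree.ElementTree as ET

# A hypothetical order record, as two services might exchange it.
order = {"id": 42, "item": "book", "quantity": 2}

# JSON: the structure and (basic) types map directly onto the data.
as_json = json.dumps(order)

# XML: a document markup language pressed into structured-data duty.
# Every value becomes text; the schema for recovering types lives elsewhere.
root = ET.Element("order")
for key, value in order.items():
    ET.SubElement(root, key).text = str(value)
as_xml = ET.tostring(root, encoding="unicode")

print(as_json)  # {"id": 42, "item": "book", "quantity": 2}
print(as_xml)   # <order><id>42</id><item>book</item><quantity>2</quantity></order>

# Round-tripping the XML gives you strings back, not ints:
parsed = {child.tag: child.text for child in ET.fromstring(as_xml)}
print(parsed["id"])  # "42" -- a string, not the integer 42
```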

\- Performance improved: how many years of Moore's law, storage, and memory
gains sit in between? This may not seem like a big deal, but it is. Since
2000 we've gained so much computing power that we can afford to slather on
another 10 layers of abstraction to make microservices easier on ourselves.

\- Software support improved. Go back to 2001 and scaffold a Java app that
worked with WSDL and SOAP. Then go check out 2018's serverless framework and
scaffold a Node.js serverless function. Back in 2001 you may have been
manually downloading JARs and putting them in a lib folder, then checking
that into CVS. In 2018 it's yarn install whatever, saving the dependency
structure (but not the binaries) in git. This amounts to hours of extra work
the developer is no longer doing. Greg LeMond, the famous cyclist, was
inadvertently talking about software when he said: "It never gets easier, you
just go faster."
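The "dependency structure, but not binaries" point above can be sketched with a minimal, illustrative package.json (the package name and versions here are invented for the example). You commit this file, and the lockfile yarn generates from it, to git; the node_modules folder full of downloaded code stays in .gitignore, and yarn install rebuilds it on any machine:

```json
{
  "name": "example-service",
  "version": "1.0.0",
  "dependencies": {
    "express": "^4.16.0"
  }
}
```

Compare that to 2001, where the JARs themselves lived in the repo and upgrading a dependency meant hunting down the right download by hand.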

\- Infrastructure improved. As with the previous point, there are hours of
work you're not doing, which lets you focus on your microservice. For
example, almost no one who writes a microservice admins the server it runs
on. Why would you take such care to admin a single box? Just execute it and
spin up another. The average developer of 2001 would be mind-blown.
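"Just execute it and spin up another" is, these days, often literally a one-line change. Here's a hedged sketch of what that looks like as a Kubernetes Deployment (all names and the image tag are invented for the example; this isn't the only way to do it):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-service
spec:
  replicas: 3                # "spin up another" = bump this number
  selector:
    matchLabels:
      app: example-service
  template:
    metadata:
      labels:
        app: example-service
    spec:
      containers:
        - name: example-service
          image: example/service:1.0   # illustrative image name
```

Nobody SSHes into the boxes behind those replicas; if one misbehaves, the orchestrator kills it and starts a fresh copy.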

...But some things stay the same... namely, there's nothing free in this
world, and everything's a tradeoff.

\- Service decomposition to the right level is still really tricky and people
regularly screw it up.

\- Being able to write your system in 10 different languages is nice for
flexibility of hiring, speed, and team independence. But then of course you
have to maintain 10 different languages worth of software.

\- We've traded debugging simple stack traces in monoliths for ultra-complex
network inspection setups, where we debug the failure to pass a parameter to
a remote function through a nightmare of complexity: through the tech stacks
of both services, through the networking layer, through the containerization
layer, through the orchestration layer, etc. etc. Monoliths were not without
their charms.

